diff --git a/_config.yml b/_config.yml index 9008050f..ce67b2a0 100755 --- a/_config.yml +++ b/_config.yml @@ -23,11 +23,7 @@ html: use_repository_button: true use_issues_button: true use_edit_page_button: true - announcement: | -
- 🚧 Scheduled maintenance: the OGGM cluster will be offline April 27 (evening CEST) – April 30 (morning CEST) 2025. - Learn more. -
+ announcement: extra_footer: |These notebooks are licensed under a BSD-3-Clause license. diff --git a/notebooks/10minutes/dynamical_spinup.ipynb b/notebooks/10minutes/dynamical_spinup.ipynb index 73096043..68749625 100644 --- a/notebooks/10minutes/dynamical_spinup.ipynb +++ b/notebooks/10minutes/dynamical_spinup.ipynb @@ -35,8 +35,16 @@ "\n", "# Locals\n", "import oggm.cfg as cfg\n", - "from oggm import utils, workflow, tasks, DEFAULT_BASE_URL\n", - "from oggm.shop import gcm_climate" + "from oggm import utils, workflow, DEFAULT_BASE_URL" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "DEFAULT_BASE_URL" ] }, { @@ -250,9 +258,9 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "This is not really visible in the plots above, but the \"old\" method of initialisation in OGGM had another issue. It assumed dynamical steady state at the begining of the simulation (the RGI date), which was required by the bed inversion process. This could lead to artifacts (mainly in the glacier length and area, as well as velocities) during the first few years of the simulation. The dynamical spinup addresses this issue by starting the simulation in 1980. \n", + "This is not really visible in the plots above, but the \"old\" method of initialisation in OGGM had another issue. It assumed a dynamical steady state at the beginning of the simulation (the RGI date), which was required by the bed inversion process. This could lead to artifacts (mainly in the glacier length and area, as well as velocities) during the first few years of the simulation. The dynamical spinup addresses this issue by starting the simulation in 1980.\n", "\n", - "One of the way to see the importance of the spinup is to have a look at glacier velocities. 
Let's plot glacier volocities along the flowline in the year 2005 (the first year we have velocities from both the dynamical spinup, and without the spinup (\"cold start\" from an equilibrium):" + "One of the ways to see the importance of the spinup is to have a look at glacier velocities. Let's plot glacier velocities along the flowline in the year 2005 (the first year for which we have velocities from both the dynamical spinup and the run without spinup, a \"cold start\" from an equilibrium):" ] }, { @@ -262,15 +270,15 @@ "outputs": [], "source": [ "f = gdir.get_filepath('fl_diagnostics', filesuffix='_historical')\n", - "with xr.open_dataset(f, group=f'fl_0') as dg:\n", + "with xr.open_dataset(f, group='fl_0') as dg:\n", " dgno = dg.load()\n", "f = gdir.get_filepath('fl_diagnostics', filesuffix='_spinup_historical')\n", - "with xr.open_dataset(f, group=f'fl_0') as dg:\n", + "with xr.open_dataset(f, group='fl_0') as dg:\n", " dgspin = dg.load()\n", "\n", "year = 2005\n", - "dgno.ice_velocity_myr.sel(time=year).plot(label='No spinup');\n", - "dgspin.ice_velocity_myr.sel(time=year).plot(label='With spinup');\n", + "dgno.ice_velocity_myr.sel(time=year).plot(label='No spinup')\n", + "dgspin.ice_velocity_myr.sel(time=year).plot(label='With spinup')\n", "plt.title(f'Velocity along the flowline at year {year}'); plt.legend();" ] }, @@ -305,6 +313,13 @@ "- return to the [OGGM documentation](https://docs.oggm.org)\n", "- back to the [table of contents](../welcome.ipynb)" ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } ], "metadata": { diff --git a/notebooks/10minutes/machine_learning.ipynb b/notebooks/10minutes/machine_learning.ipynb index 07fe5cfa..0e7461b4 100644 --- a/notebooks/10minutes/machine_learning.ipynb +++ b/notebooks/10minutes/machine_learning.ipynb @@ -13,7 +13,7 @@ "source": [ "In this notebook, we want to showcase what OGGM does best: **preparing data for your modelling workflow**.\n", "\n", - "We use 
preprocessed directories which contain most data available in [the OGGM shop](https://docs.oggm.org/en/stable/input-data.html) to illustrate how these could be used to inform data-based workflows. The data that is available in the shop and is show cased here, is more than is required for the regular OGGM workflow, which you will see in a bit." + "We use preprocessed directories which contain most data available in [the OGGM shop](https://docs.oggm.org/en/stable/input-data.html) to illustrate how these could be used to inform data-based workflows. The data available in the shop and showcased here is more than is required for the regular OGGM workflow, as you will see in a bit." ] }, { @@ -70,8 +70,9 @@ "cfg.PARAMS['use_multiprocessing'] = False\n", "# Local working directory (where OGGM will write its output)\n", "cfg.PATHS['working_dir'] = utils.gettempdir('OGGM_Toy_Thickness_Model')\n", - "# We use the preprocessed directories with additional data in it: \"W5E5_w_data\" \n", - "base_url = 'https://cluster.klima.uni-bremen.de/~oggm/gdirs/oggm_v1.6/L3-L5_files/2023.3/elev_bands/W5E5_w_data/'\n", + "# We use the preprocessed directories with additional data in it: \"W5E5_w_data\"\n", + "# the old 2023.3 preprocessed gdirs had \"0\"-values instead of NaN-values for the Millan2022 data. This was corrected in the 2025.6 gdirs.\n", + "base_url = 'https://cluster.klima.uni-bremen.de/~oggm/gdirs/oggm_v1.6/L3-L5_files/2025.6/elev_bands_w_data/W5E5/per_glacier/'\n", "gdirs = workflow.init_glacier_directories(['RGI60-01.16195'], from_prepro_level=3, prepro_base_url=base_url, prepro_border=10)" ] }, @@ -116,6 +117,13 @@ "ds" ] }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "There are a few NaN values; we will remove those later for the machine learning part." 
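As a side note on the NaN values mentioned in the added cell above: the screen-then-drop step it refers to can be sketched with a tiny, made-up DataFrame (pandas only; the column names `thick` and `millan_v` are illustrative here, not read from any glacier directory):

```python
import numpy as np
import pandas as pd

# Made-up point data: a thickness column plus a velocity column
# containing a few gaps, mimicking the gridded shop data
df = pd.DataFrame({
    'thick': [120.0, 80.0, 95.0, 60.0],
    'millan_v': [15.0, np.nan, 22.0, np.nan],
})

# Rows that contain at least one NaN value, for inspection
nan_rows = df[df.isna().any(axis=1)]

# Drop them before any statistical modelling
df_clean = df.dropna()

# A watertight "no NaN left" check; note that np.any(~df.isna())
# would only test that at least one value is NOT NaN
assert not df_clean.isna().any().any()
```

`df.isna().any(axis=1)` flags rows with a gap in any column, which makes it easy to look at what will be removed before calling `dropna()`.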
+ ] + }, { "cell_type": "markdown", "metadata": {}, @@ -147,7 +155,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Lets start with the [ITSLIVE](https://its-live.jpl.nasa.gov/#data) velocity data: " + "Let's start with the [ITSLIVE](https://its-live.jpl.nasa.gov/#data) velocity data:" ] }, { @@ -390,7 +398,7 @@ }, "outputs": [], "source": [ - "ds.slope.plot();\n", + "ds.slope.plot()\n", "plt.axis('equal');" ] }, @@ -422,7 +430,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Not convinced yet? Lets spend 10 more minutes to apply a (very simple) machine learning workflow " + "## Not convinced yet? Let's spend 10 more minutes to apply a (very simple) machine learning workflow" ] }, { @@ -451,7 +459,9 @@ "coords = np.array([p.xy for p in df.geometry]).squeeze()\n", "df['lon'] = coords[:, 0]\n", "df['lat'] = coords[:, 1]\n", - "df = df[['lon', 'lat', 'thick']]" + "df = df[['lon', 'lat', 'thick']]\n", + "# check that there are no NaN values in the data (otherwise, we would remove them)\n", + "assert not df.isna().any().any()" ] }, { @@ -491,7 +501,7 @@ "source": [ "geom = gdir.read_shapefile('outlines')\n", "f, ax = plt.subplots()\n", - "df.plot.scatter(x='x', y='y', c='thick', cmap='viridis', s=10, ax=ax);\n", + "df.plot.scatter(x='x', y='y', c='thick', cmap='viridis', s=10, ax=ax)\n", "geom.plot(ax=ax, facecolor='none', edgecolor='k');" ] }, @@ -524,7 +534,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Here, we will keep them all and interpolate the variables of interest at the point's location. We use [xarray](http://xarray.pydata.org/en/stable/interpolation.html#advanced-interpolation) for this:" + "Here, we will keep them all and interpolate the variables of interest at the point's location. 
We use [xarray](https://xarray.pydata.org/en/stable/interpolation.html#advanced-interpolation) for this:" ] }, { @@ -561,6 +571,32 @@ " df[vn] = ds[vn].interp(x=('z', df.x), y=('z', df.y))" ] }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# there are a few rows without millan velocities\n", + "df[df.isna().any(axis=1)]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**We remove those rows with NaN values (inside of Millan velocities) to have a fair comparison**" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "df = df.dropna()" + ] + }, { "cell_type": "markdown", "metadata": {}, @@ -575,8 +611,8 @@ "outputs": [], "source": [ "f, (ax1, ax2, ax3) = plt.subplots(1, 3, figsize=(16, 5))\n", - "df.plot.scatter(x='dis_from_border', y='thick', ax=ax1); ax1.set_title('dis_from_border');\n", - "df.plot.scatter(x='slope', y='thick', ax=ax2); ax2.set_title('slope');\n", + "df.plot.scatter(x='dis_from_border', y='thick', ax=ax1); ax1.set_title('dis_from_border')\n", + "df.plot.scatter(x='slope', y='thick', ax=ax2); ax2.set_title('slope')\n", "df.plot.scatter(x='oggm_mb_above_z', y='thick', ax=ax3); ax3.set_title('oggm_mb_above_z');" ] }, @@ -584,7 +620,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "There is a negative correlation with slope (as expected), a positive correlation with the mass-flux (oggm_mb_above_z), and a slight positive correlation with the distance from the glacier boundaries. There is also some correlaction with ice velocity, but not a strong one:" + "There is a negative correlation with slope (as expected), a positive correlation with the mass-flux (oggm_mb_above_z), and a slight positive correlation with the distance from the glacier boundaries. 
There is also some correlation with ice velocity, but not a strong one:" ] }, { @@ -596,7 +632,7 @@ "outputs": [], "source": [ "f, (ax1, ax2) = plt.subplots(1, 2, figsize=(12, 5))\n", - "df.plot.scatter(x='millan_v', y='thick', ax=ax1); ax1.set_title('millan_v');\n", + "df.plot.scatter(x='millan_v', y='thick', ax=ax1); ax1.set_title('millan_v')\n", "df.plot.scatter(x='itslive_v', y='thick', ax=ax2); ax2.set_title('itslive_v');" ] }, @@ -611,7 +647,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "There are so many points that much of the information obtained by OGGM is interpolated and therefore not biring much new information to a statistical model. A way to deal with this is to aggregate all the measurement points per grid point and to average them. Let's do this: " + "There are so many points that much of the information obtained by OGGM is interpolated and is therefore not bringing much new information to a statistical model. A way to deal with this is to aggregate all the measurement points per grid point and to average them. 
Let's do this:" ] }, { @@ -662,7 +698,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We now have 9 times less points, but the main features of the data remain unchanged:" + "We now have 9 times fewer points, but the main features of the data remain unchanged:" ] }, { @@ -674,8 +710,8 @@ "outputs": [], "source": [ "f, (ax1, ax2, ax3) = plt.subplots(1, 3, figsize=(16, 5))\n", - "df_agg.plot.scatter(x='dis_from_border', y='thick', ax=ax1);\n", - "df_agg.plot.scatter(x='slope', y='thick', ax=ax2);\n", + "df_agg.plot.scatter(x='dis_from_border', y='thick', ax=ax1)\n", + "df_agg.plot.scatter(x='slope', y='thick', ax=ax2)\n", "df_agg.plot.scatter(x='oggm_mb_above_z', y='thick', ax=ax3);" ] }, @@ -702,7 +738,7 @@ "outputs": [], "source": [ "import seaborn as sns\n", - "plt.figure(figsize=(10, 8));\n", + "plt.figure(figsize=(10, 8))\n", "sns.heatmap(df.corr(), cmap='RdBu');" ] }, @@ -780,9 +816,9 @@ "source": [ "odf = df.copy()\n", "odf['thick_predicted'] = lasso_cv.predict(data.values)\n", - "f, ax = plt.subplots(figsize=(6, 6));\n", - "odf.plot.scatter(x='thick', y='thick_predicted', ax=ax);\n", - "plt.xlim([-25, 220]);\n", + "f, ax = plt.subplots(figsize=(6, 6))\n", + "odf.plot.scatter(x='thick', y='thick_predicted', ax=ax)\n", + "plt.xlim([-25, 220])\n", "plt.ylim([-25, 220]);" ] }, @@ -848,7 +884,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "The fact that the hyper-parameter alpha and the score change that much between iterations is a sign that the model isn't very robust." + "The fact that the hyperparameter alpha and the score change that much between iterations is a sign that the model isn't very robust." 
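The robustness point above can be made concrete with a small synthetic experiment, independent of the glacier data: refit on several random train/test splits and compare the held-out scores. This is a sketch only (numpy, random made-up data, and ordinary least squares standing in for the notebook's Lasso fit):

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in for the glacier dataset: two predictors, one target
n = 60
X = rng.normal(size=(n, 2))
y = X @ np.array([1.5, -0.8]) + rng.normal(scale=2.0, size=n)

scores = []
for seed in range(5):
    # New random train/test split each iteration
    idx = np.random.default_rng(seed).permutation(n)
    train, test = idx[:40], idx[40:]
    # Ordinary least squares fit on the training points
    coef, *_ = np.linalg.lstsq(X[train], y[train], rcond=None)
    pred = X[test] @ coef
    # R^2 score on the held-out points
    ss_res = np.sum((y[test] - pred) ** 2)
    ss_tot = np.sum((y[test] - y[test].mean()) ** 2)
    scores.append(1 - ss_res / ss_tot)

# A large spread across splits signals a model that is not robust
spread = max(scores) - min(scores)
```

If `spread` is large compared to the scores themselves, the fitted model (and any cross-validated hyperparameter such as alpha) depends strongly on which points happen to land in the training set.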
] }, { @@ -879,7 +915,9 @@ "# Generate our dataset\n", "pred_data = pd.DataFrame()\n", "for vn in data.columns:\n", - " pred_data[vn] = ds[vn].data[ds.glacier_mask == 1]\n", + " # only take glacier gridpoints\n", + " # and only take those \"gridpoints\" where millan velocities is not NaN\n", + " pred_data[vn] = ds[vn].data[(ds.glacier_mask == 1) & (~np.isnan(ds.millan_v))]\n", "\n", "# Normalize using the same normalization constants\n", "pred_data = (pred_data - data_mean) / data_std\n", @@ -899,7 +937,7 @@ "source": [ "# Back to 2d and in xarray\n", "var = ds[vn].data * np.nan\n", - "var[ds.glacier_mask == 1] = pred_data['thick']\n", + "var[(ds.glacier_mask == 1) & (~np.isnan(ds.millan_v))] = pred_data['thick']\n", "ds['linear_model_thick'] = (('y', 'x'), var)\n", "ds['linear_model_thick'].attrs['description'] = 'Predicted thickness'\n", "ds['linear_model_thick'].attrs['units'] = 'm'\n", @@ -922,7 +960,7 @@ "- we used two methods to extract these data at point locations: with interpolation or with aggregated averages on each grid point\n", "- as an application example, we trained a linear regression model to predict the ice thickness of this glacier at unseen locations\n", "\n", - "The model we developed was quite bad and we used quite lousy statistics. If I had more time to make it better, I would:\n", + "The model we developed was quite bad, and we used quite lousy statistics. 
If I had more time to make it better, I would:\n", "- make a pre-selection of meaningful predictors to avoid discontinuities\n", "- use a non-linear model\n", "- use cross-validation to better asses the true skill of the model\n", @@ -956,7 +994,7 @@ }, "outputs": [], "source": [ - "# Write our thinckness estimates back to disk\n", + "# Write our thickness estimates back to disk\n", "ds.to_netcdf(gdir.get_filepath('gridded_data'))\n", "# Distribute OGGM thickness using default values only\n", "workflow.execute_entity_task(tasks.distribute_thickness_per_altitude, gdirs);" @@ -989,17 +1027,34 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "tags": [] - }, + "metadata": {}, + "outputs": [], + "source": [] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, "outputs": [], "source": [ - "f, ((ax1, ax2), (ax3, ax4)) = plt.subplots(2, 2, figsize=(12, 10));\n", - "ds['linear_model_thick'].plot(ax=ax1); ax1.set_title('Statistical model');\n", - "ds['distributed_thickness'].plot(ax=ax2); ax2.set_title('OGGM');\n", - "ds['millan_ice_thickness'].where(ds.glacier_mask).plot(ax=ax3); ax3.set_title('Millan 2022');\n", - "ds['consensus_ice_thickness'].plot(ax=ax4); ax4.set_title('Farinotti 2019');\n", - "plt.tight_layout();" + "f, ((ax1, ax2), (ax3, ax4)) = plt.subplots(\n", + " 2, 2, figsize=(12, 10), constrained_layout=True\n", + ")\n", + "vmin= 0\n", + "vmax = ds[['linear_model_thick','distributed_thickness','millan_ice_thickness','consensus_ice_thickness']].to_dataframe().max().max()*1.05\n", + "\n", + "\n", + "im1 = ds['linear_model_thick'].plot(ax=ax1, vmin=vmin, vmax=vmax, add_colorbar=False)\n", + "ds['distributed_thickness'].plot(ax=ax2, vmin=vmin, vmax=vmax, add_colorbar=False)\n", + "ds['millan_ice_thickness'].where(ds.glacier_mask).plot(ax=ax3, vmin=vmin, vmax=vmax, add_colorbar=False)\n", + "ds['consensus_ice_thickness'].plot(ax=ax4, vmin=vmin, vmax=vmax, add_colorbar=False)\n", + "\n", + "ax1.set_title('Statistical 
model')\n", + "ax2.set_title('OGGM')\n", + "ax3.set_title('Millan 2022')\n", + "ax4.set_title('Farinotti 2019')\n", + "cbar = f.colorbar(im1, ax=[ax1, ax2, ax3, ax4], shrink=0.8, location='right', pad=0.02)\n", + "cbar.set_label(\"Ice thickness (m)\");\n" ] }, { @@ -1010,16 +1065,26 @@ }, "outputs": [], "source": [ - "f, ((ax1, ax2), (ax3, ax4)) = plt.subplots(2, 2, figsize=(12, 10));\n", - "df_agg.plot.scatter(x='thick', y='linear_model_thick', ax=ax1);\n", - "ax1.set_xlim([-25, 220]); ax1.set_ylim([-25, 220]); ax1.set_title('Statistical model');\n", - "df_agg.plot.scatter(x='thick', y='oggm_thick', ax=ax2);\n", - "ax2.set_xlim([-25, 220]); ax2.set_ylim([-25, 220]); ax2.set_title('OGGM');\n", - "df_agg.plot.scatter(x='thick', y='millan_thick', ax=ax3);\n", - "ax3.set_xlim([-25, 220]); ax3.set_ylim([-25, 220]); ax3.set_title('Millan 2022');\n", - "df_agg.plot.scatter(x='thick', y='consensus_thick', ax=ax4);\n", + "## check that there are no NaN values\n", + "assert not df_agg.isna().any().any()\n", + "###\n", + "f, ((ax1, ax2), (ax3, ax4)) = plt.subplots(2, 2, figsize=(12, 10))\n", + "df_agg.plot.scatter(x='thick', y='linear_model_thick', ax=ax1)\n", + "ax1.set_xlim([-25, 220]); ax1.set_ylim([-25, 220]); ax1.set_title('Statistical model')\n", + "df_agg.plot.scatter(x='thick', y='oggm_thick', ax=ax2)\n", + "ax2.set_xlim([-25, 220]); ax2.set_ylim([-25, 220]); ax2.set_title('OGGM')\n", + "df_agg.plot.scatter(x='thick', y='millan_thick', ax=ax3)\n", + "ax3.set_xlim([-25, 220]); ax3.set_ylim([-25, 220]); ax3.set_title('Millan 2022')\n", + "df_agg.plot.scatter(x='thick', y='consensus_thick', ax=ax4)\n", "ax4.set_xlim([-25, 220]); ax4.set_ylim([-25, 220]); ax4.set_title('Farinotti 2019');" ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } } ], "metadata": { diff --git a/notebooks/10minutes/preprocessed_directories.ipynb b/notebooks/10minutes/preprocessed_directories.ipynb index 0b4c1937..5b77c636 100644 --- 
a/notebooks/10minutes/preprocessed_directories.ipynb +++ b/notebooks/10minutes/preprocessed_directories.ipynb @@ -192,7 +192,7 @@ "- `RGI60-11.01450`: [Aletsch Glacier](https://en.wikipedia.org/wiki/Aletsch_Glacier) in the Swiss Alps\n", "\n", "Here is a list of other glaciers you might want to try out:\n", - "- `RGI60-11.00897`: [Hintereisferner](http://acinn.uibk.ac.at/research/ice-and-climate/projects/hintereisferner) in the Austrian Alps.\n", + "- `RGI60-11.00897`: [Hintereisferner](https://www.uibk.ac.at/en/acinn/research/ice-and-climate/projects/hintereisferner/) in the Austrian Alps.\n", "- `RGI60-18.02342`: [Tasman Glacier](https://en.wikipedia.org/wiki/Tasman_Glacier) in New Zealand\n", "- `RGI60-11.00787`: [Kesselwandferner](https://de.wikipedia.org/wiki/Kesselwandferner) in the Austrian Alps\n", "- ... or any other glacier identifier! You can find other glacier identifiers by exploring the [GLIMS viewer](https://www.glims.org/maps/glims). See the [working with the RGI](../tutorials/working_with_rgi.ipynb) tutorial for an introduction on RGI IDs and the GLIMS browser.\n", @@ -215,7 +215,7 @@ "\n", "To handle this situation, OGGM uses a workflow based on data persistence on disk: instead of passing data as python variables from one task to another, each task will read the data from disk and then write the computation results back to the disk, making these new data available for the next task in the queue. These glacier specific data are located in [glacier directories](https://docs.oggm.org/en/stable/generated/oggm.GlacierDirectory.html#oggm.GlacierDirectory). \n", "\n", - "One main advantage of this workflow is that OGGM can prepare data and make it available to everyone! Here is an example of an url where such data can be found:" + "One main advantage of this workflow is that OGGM can prepare data and make it available to everyone! 
Here is an example of a url where such data can be found:" ] }, { @@ -258,7 +258,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "- the keyword `from_prepro_level` indicates that we will start from [pre-processed directories](https://docs.oggm.org/en/stable/shop.html#pre-processed-directories), i.e. data that are already prepared by the OGGM team. In many cases you will want to start from pre-processed directories, in most cases from level 3 or 5. For level 3 and above the model has already been calibrated, so you no longer need to do that yourself and can start rigth away with your simulation. Here we start from level 4 and add some data to the processing in order to demonstrate the OGGM workflow.\n", + "- the keyword `from_prepro_level` indicates that we will start from [pre-processed directories](https://docs.oggm.org/en/stable/shop.html#pre-processed-directories), i.e. data that are already prepared by the OGGM team. In many cases you will want to start from pre-processed directories, in most cases from level 3 or 5. For level 3 and above the model has already been calibrated, so you no longer need to do that yourself and can start right away with your simulation. Here we start from level 4 and add some data to the processing in order to demonstrate the OGGM workflow.\n", "- the `prepro_border` keyword indicates the number of grid points which we'd like to add to each side of the glacier for the local map: the larger the glacier will grow, the larger the border parameter should be. The available pre-processed border values are: **10, 80, 160, 240** (depending on the model set-ups there might be more or less options). These are the fixed map sizes we prepared for you - any other map size will require a full processing (see the [further DEM sources example](../tutorials/dem_sources.ipynb) for a tutorial)." 
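The disk-persistence workflow that glacier directories implement can be mimicked in plain Python to see the idea (standard library only; the task and file names below are invented for illustration, not OGGM API):

```python
import json
import tempfile
from pathlib import Path

# A stand-in "glacier directory": every task reads its inputs from disk
# and writes its results back, so tasks only communicate via files
gdir = Path(tempfile.mkdtemp())

def task_define_grid(gdir):
    # First task: write some basic data into the directory
    (gdir / 'grid.json').write_text(json.dumps({'nx': 4, 'ny': 3}))

def task_compute_area(gdir):
    # Later task: read what the previous task wrote, add its own output
    grid = json.loads((gdir / 'grid.json').read_text())
    (gdir / 'area.json').write_text(json.dumps({'area': grid['nx'] * grid['ny']}))

# Run the task queue; no data is passed between tasks as Python variables
for task in (task_define_grid, task_compute_area):
    task(gdir)

area = json.loads((gdir / 'area.json').read_text())['area']
```

Because all state lives in files, a failed run can resume from the last completed task, and many directories can be processed in parallel without sharing memory.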
] }, @@ -384,7 +384,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Glacier directories are the central object for model users and developpers to access data for this glacier. Let's say for example that you would like to retrieve the climate data that we have prepared for you. You can ask the glacier directory to tell you where this data is:" + "Glacier directories are the central object for model users and developers to access data for this glacier. Let's say for example that you would like to retrieve the climate data that we have prepared for you. You can ask the glacier directory to tell you where this data is:" ] }, { @@ -419,8 +419,8 @@ "with xr.open_dataset(gdir.get_filepath('climate_historical')) as ds:\n", " ds = ds.load()\n", "# Plot the data\n", - "ds.temp.resample(time='YS').mean().plot(label=f'Annual temperature at {int(ds.ref_hgt)}m a.s.l.');\n", - "ds.temp.resample(time='YS').mean().rolling(time=31, center=True, min_periods=15).mean().plot(label='30yr average');\n", + "ds.temp.resample(time='YS').mean().plot(label=f'Annual temperature at {int(ds.ref_hgt)}m a.s.l.')\n", + "ds.temp.resample(time='YS').mean().rolling(time=31, center=True, min_periods=15).mean().plot(label='30yr average')\n", "plt.legend();" ] }, @@ -461,10 +461,10 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "There are two different types of \"[tasks](http://docs.oggm.org/en/stable/api.html#entity-tasks)\" in OGGM:\n", + "There are two different types of \"[tasks](https://docs.oggm.org/en/stable/api.html#entity-tasks)\" in OGGM:\n", "\n", "**Entity Tasks**\n", - ": Standalone operations to be realized on one single glacier entity, independently from the others. The majority of OGGM tasks are entity tasks. They are parallelisable: the same task can run on several glaciers in parallel.\n", + ": Standalone operations to be realized on one single glacier entity, independently of the others. The majority of OGGM tasks are entity tasks. 
They are parallelisable: the same task can run on several glaciers in parallel.\n", "\n", "**Global Tasks**\n", ": Tasks which require to work on several glacier entities at the same time. Model parameter calibration or the compilation of several glaciers' output are examples of global tasks. \n", @@ -539,6 +539,7 @@ "## What's next?\n", "\n", "- visit the next tutorial: 10 minutes to... [a glacier change projection with GCM data](run_with_gcm.ipynb)\n", + "- do you want to understand how the preprocessed directories are built? Check out [this step-by-step guide](../tutorials/full_prepro_workflow.ipynb)\n", "- back to the [table of contents](../welcome.ipynb)\n", "- return to the [OGGM documentation](https://docs.oggm.org)" ] diff --git a/notebooks/10minutes/run_with_gcm.ipynb b/notebooks/10minutes/run_with_gcm.ipynb index 3f69fc8c..fb364093 100644 --- a/notebooks/10minutes/run_with_gcm.ipynb +++ b/notebooks/10minutes/run_with_gcm.ipynb @@ -11,7 +11,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "In this example, we illustrate how to do a typical \"projection run\", i.e. using GCM data. Here we will first use already bias-corrected CMIP6 data from [ISIMIP3b](https://www.isimip.org/gettingstarted/isimip3b-bias-adjustment) and than show how alternatives like the original CMIP5 and CMIP6 data can be used. \n", + "In this example, we illustrate how to do a typical \"projection run\", i.e. using GCM data. 
Here we will first use already bias-corrected CMIP6 data from [ISIMIP3b](https://www.isimip.org/gettingstarted/isimip3b-bias-adjustment) and then show how alternatives like the original CMIP5 and CMIP6 data can be used.\n", "\n", "There are three important steps:\n", "- download the OGGM pre-processed directories containing a pre-calibrated and spun-up glacier model\n", @@ -35,7 +35,6 @@ "outputs": [], "source": [ "# Libs\n", - "import xarray as xr\n", "import matplotlib.pyplot as plt\n", "\n", "# Locals\n", @@ -108,7 +107,7 @@ "source": [ "ds = utils.compile_run_output(gdirs, input_filesuffix='_spinup_historical')\n", "vol_ref2000 = ds.volume / ds.volume.sel(time=2000) * 100\n", - "vol_ref2000.plot(hue='rgi_id');\n", + "vol_ref2000.plot(hue='rgi_id')\n", "plt.ylabel('Volume (%, reference 2000)');" ] }, @@ -116,7 +115,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Each RGI glacier has an \"inventory date\", the time at which the ouline is valid:" + "Each RGI glacier has an \"inventory date\", the time at which the outline is valid:" ] }, { @@ -148,7 +147,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "A typical use case for OGGM will be to use climate model output (here bias-corrected CMIP6 GCMs from [ISIMIP3b](https://www.isimip.org/gettingstarted/isimip3b-bias-adjustment/)). We use the files [we mirrored in Bremen](https://cluster.klima.uni-bremen.de/~oggm/cmip6/isimip3b/flat/monthly/) here, but you can use whichever you want. From ISIMIP3b, we have 5 GCMs and 3 SSPs on the cluster. You can find more information on the [ISIMIP website](https://www.isimip.org/gettingstarted/isimip3b-bias-adjustment). Let's download the data:" + "A typical use case for OGGM will be to use climate model output (here bias-corrected CMIP6 GCMs from [ISIMIP3b](https://www.isimip.org/gettingstarted/isimip3b-bias-adjustment/)). 
We use the files [we mirrored in Bremen](https://cluster.klima.uni-bremen.de/~oggm/cmip6/isimip3b/flat/monthly/) here, but you can use whichever you want. From ISIMIP3b, we have 14 GCMs with at least three SSPs per GCM on the cluster (check out https://cluster.klima.uni-bremen.de/~oggm/cmip6/isimip3b/flat/2025.11.25/monthly/ to see the names of the GCMs). You can find more information on the [ISIMIP3b Zenodo](https://data.isimip.org/10.48364/ISIMIP.581124.2). Let's download the data:" ] }, { @@ -159,7 +158,8 @@ }, "outputs": [], "source": [ - "# you can choose one of these 5 different GCMs:\n", + "# you can choose from in total 14 different climate models (GCMs).\n", + "# Here are for example the 5 primary GCMs:\n", "# 'gfdl-esm4_r1i1p1f1', 'mpi-esm1-2-hr_r1i1p1f1', 'mri-esm2-0_r1i1p1f1' (\"low sensitivity\" models, within typical ranges from AR6)\n", "# 'ipsl-cm6a-lr_r1i1p1f1', 'ukesm1-0-ll_r1i1p1f2' (\"hotter\" models, especially ukesm1-0-ll)\n", "member = 'mri-esm2-0_r1i1p1f1' \n", @@ -172,7 +172,7 @@ " member=member,\n", " # recognize the climate file for later\n", " output_filesuffix=f'_ISIMIP3b_{member}_{ssp}'\n", - " );" + " )" ] }, { @@ -224,7 +224,7 @@ " climate_input_filesuffix=rid, # use the chosen scenario\n", " init_model_filesuffix='_spinup_historical', # this is important! 
Start from 2020 glacier\n", " output_filesuffix=rid, # recognize the run for later\n", - " );" + " )" ] }, { @@ -250,8 +250,8 @@ " # Compile the output into one file\n", " ds = utils.compile_run_output(gdirs, input_filesuffix=rid)\n", " # Plot it\n", - " ds.isel(rgi_id=0).volume.plot(ax=ax1, label=ssp, c=color_dict[ssp]);\n", - " ds.isel(rgi_id=1).volume.plot(ax=ax2, label=ssp, c=color_dict[ssp]);\n", + " ds.isel(rgi_id=0).volume.plot(ax=ax1, label=ssp, c=color_dict[ssp])\n", + " ds.isel(rgi_id=1).volume.plot(ax=ax2, label=ssp, c=color_dict[ssp])\n", "plt.legend();" ] }, @@ -273,15 +273,16 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "ISIMIP data is very useful because it is bias corrected. Furthermore, it offers daily data (which we will soon make available in OGGM).\n", + "ISIMIP data is very useful because it is bias corrected. Furthermore, it offers daily data, which we will soon use in OGGM.\n", "\n", "But you may want a higher diversity of models or scenarios: for this, you may also use the CMIP5 or CMIP6 GCMs directly. These need to be bias-corrected first to the applied baseline climate (see [process_gcm_data](https://docs.oggm.org/en/stable/generated/oggm.tasks.process_gcm_data.html#oggm.shop.gcm_climate.process_gcm_data)). This relatively simple bias-correction is automatically done by `process_cmip_data` and is very important, as the model is very sensitive to temperature variability (see the following [blogpost](https://oggm.org/2021/08/05/mean-forcing/) for more details).\n", - "- CMIP5 has 4 different RCP scenarios and a variety of GCMs, online you can find them [here](https://cluster.klima.uni-bremen.de/~oggm/cmip5-ng). The above mentioned storage contains information about the data, [how to cite them](https://cluster.klima.uni-bremen.de/~oggm/cmip5-ng/README) and [tabular summaries](https://cluster.klima.uni-bremen.de/~oggm/cmip5-ng/all_gcm_table.html) of the available GCMs. 
\n", - "- CMIP6 has 4 different SSP scenarios, see [this table](https://cluster.klima.uni-bremen.de/~oggm/cmip6/all_gcm_table.html) for a summary of available GCMs. There are even some CMIP6 runs that go until [2300](https://cluster.klima.uni-bremen.de/~oggm/cmip6/gcm_table_2300.html).\n", + "- CMIP5 has 4 different RCP scenarios and a variety of GCMs; you can find them online [here](https://cluster.klima.uni-bremen.de/~oggm/cmip5-ng). The above-mentioned storage contains information about the data, [how to cite them](https://cluster.klima.uni-bremen.de/~oggm/cmip5-ng/README) and [tabular summaries](https://cluster.klima.uni-bremen.de/~oggm/cmip5-ng/all_gcm_table.html) of the available GCMs.\n", + "- CMIP6 has up to 8 different SSP scenarios, see [this table](https://cluster.klima.uni-bremen.de/~oggm/cmip6/gcm_table_2100.html) for a summary of available GCMs. There are even some CMIP6 runs that go until [2300](https://cluster.klima.uni-bremen.de/~oggm/cmip6/gcm_table_2300.html).\n", "\n", - "> Note, that the CMIP5 and CMIP6 files are much larger than the ISIMIP3b files. This is because we use a simple processing trick for the ISIMIP3b GCM files as we only save the glacier gridpoints, instead of the entire globe for CMIP5 and CMIP6.0 \n", + "> Note that the CMIP5 and CMIP6 files are much larger than the ISIMIP3b files. This is because we use a simple processing trick for the ISIMIP3b GCM files: we only save the glacier gridpoints, instead of the entire globe as for CMIP5 and CMIP6.\n", "\n", - "**Therefore: run the following code only if it is ok to download a few gigabytes of data.** Set the variable below to true to run it. " + "**Therefore: run the following code only if it is ok to download a few gigabytes of data.** Set the variable below to true to run it.\n", + "(**Attention! 
This may take some time ...**)" ] }, { @@ -320,7 +321,7 @@ " filesuffix='_CMIP5_CCSM4_{}'.format(rcp), # recognize the climate file for later\n", " fpath_temp=ft, # temperature projections\n", " fpath_precip=fp, # precip projections\n", - " );\n", + " )\n", "\n", " # Run OGGM\n", " for rcp in ['rcp26', 'rcp45', 'rcp85']: #'rcp60',\n", @@ -330,16 +331,16 @@ " climate_input_filesuffix=rid, # use the chosen scenario\n", " init_model_filesuffix='_historical', # this is important! Start from 2020 glacier\n", " output_filesuffix=rid, # recognize the run for later\n", - " );\n", + " )\n", "\n", " # Plot\n", " f, (ax1, ax2) = plt.subplots(1, 2, figsize=(14, 4))\n", " for rcp in ['rcp26', 'rcp45', 'rcp85']: #'rcp60',\n", " rid = '_CMIP5_CCSM4_{}'.format(rcp)\n", " ds = utils.compile_run_output(gdirs, input_filesuffix=rid)\n", - " ds.isel(rgi_id=0).volume.plot(ax=ax1, label=rcp, c=color_dict_rcp[rcp]);\n", - " ds.isel(rgi_id=1).volume.plot(ax=ax2, label=rcp, c=color_dict_rcp[rcp]);\n", - " plt.legend();" + " ds.isel(rgi_id=0).volume.plot(ax=ax1, label=rcp, c=color_dict_rcp[rcp])\n", + " ds.isel(rgi_id=1).volume.plot(ax=ax2, label=rcp, c=color_dict_rcp[rcp])\n", + " plt.legend()" ] }, { @@ -348,7 +349,7 @@ "source": [ "Now, the same for CMIP6 but instead of RCPs, now SSPs and again with another GCM:\n", "\n", - "(**Attention! This may take some time ...**) Set the variable below to true to run it." + "Set the variable below to true to run it." ] }, { @@ -385,7 +386,7 @@ " filesuffix='_CMIP6_CESM2_{}'.format(ssp), # recognize the climate file for later\n", " fpath_temp=ft, # temperature projections\n", " fpath_precip=fp, # precip projections\n", - " );\n", + " )\n", "\n", " # Run OGGM\n", " for ssp in ['ssp126', 'ssp585']:\n", @@ -395,19 +396,56 @@ " climate_input_filesuffix=rid, # use the chosen scenario\n", " init_model_filesuffix='_historical', # this is important! 
Start from 2020 glacier\n", " output_filesuffix=rid, # recognize the run for later\n", - " );\n", + " )\n", "\n", " # Plot\n", " f, (ax1, ax2) = plt.subplots(1, 2, figsize=(14, 4))\n", " for ssp in ['ssp126', 'ssp585']:\n", " rid = '_CMIP6_CESM2_{}'.format(ssp)\n", " ds = utils.compile_run_output(gdirs, input_filesuffix=rid)\n", - " ds.isel(rgi_id=0).volume.plot(ax=ax1, label=ssp, c=color_dict[ssp]);\n", - " ds.isel(rgi_id=1).volume.plot(ax=ax2, label=ssp, c=color_dict[ssp]);\n", + " ds.isel(rgi_id=0).volume.plot(ax=ax1, label=ssp, c=color_dict[ssp])\n", + " ds.isel(rgi_id=1).volume.plot(ax=ax2, label=ssp, c=color_dict[ssp])\n", "\n", - " plt.legend();" + " plt.legend()" ] }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Have 5 minutes more? Do projections with another preprocessed glacier directory\n", + "\n", + "If you use the default preprocessed glacier directory (`DEFAULT_BASE_URL`), you do the same as in the [OGGM standard projections](https://docs.oggm.org/en/stable/download-projections.html). Per-glacier, regional, or global projections with this standard option are available directly at the [OGGM/oggm-standard-projections-csv-files repository](https://github.com/OGGM/oggm-standard-projections-csv-files).\n", + "\n", + "You can also do projections with another preprocessed glacier directory! We have several options of [preprocessed glacier directories available](https://docs.oggm.org/en/stable/shop.html#available-pre-processed-configurations).\n", + "If you want to e.g. 
use ERA5 instead of W5E5, you just have to update one of the lines above to\n", + "```python\n", + "new_url = 'https://cluster.klima.uni-bremen.de/~oggm/gdirs/oggm_v1.6/L3-L5_files/2025.6/elev_bands/ERA5/per_glacier_spinup/'\n", + "gdirs = workflow.init_glacier_directories(rgi_ids, from_prepro_level=5, prepro_base_url=new_url)\n", + "```" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "And then you can rerun all the cells below that line! Note that our processed ERA5 data (and thus, the historical runs) go until the end of 2025, and not just until the end of 2019." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + }, { "cell_type": "markdown", "metadata": {}, diff --git a/notebooks/construction/area_length_filter.ipynb b/notebooks/construction/area_length_filter.ipynb index 7e76d3ea..3770d0e5 100644 --- a/notebooks/construction/area_length_filter.ipynb +++ b/notebooks/construction/area_length_filter.ipynb @@ -64,7 +64,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We take the Kesselwandferner in the Austrian Alps:" + "We take the Hintereisferner in the Austrian Alps:" ] }, { @@ -73,7 +73,7 @@ "metadata": {}, "outputs": [], "source": [ - "rgi_ids = ['RGI60-11.00787']" + "rgi_ids = ['RGI60-11.00897'] # changed to HEF, because KWF does not show any spikes" ] }, { @@ -92,7 +92,7 @@ "# in OGGM v1.6 you have to explicitly indicate the url from where you want to start from\n", "# we will use here the elevation band flowlines which are much simpler than the centerlines\n", "base_url = ('https://cluster.klima.uni-bremen.de/~oggm/gdirs/oggm_v1.6/'\n", - " 'L3-L5_files/2023.3/elev_bands/W5E5/')\n", + " 'L3-L5_files/2025.6/elev_bands/W5E5/per_glacier')\n", "gdirs = workflow.init_glacier_directories(rgi_ids, from_prepro_level=5, prepro_border=80,\n", " prepro_base_url=base_url)" ] @@ -118,7 +118,7 @@ "outputs": [], "source": [ 
"workflow.execute_entity_task(tasks.run_random_climate, gdirs,\n", - " nyears=200, y0=2000, seed=5,\n", + " nyears=200, y0=1995, seed=5,\n", " output_filesuffix='_commitment');" ] }, @@ -206,8 +206,8 @@ "outputs": [], "source": [ "# Plot\n", - "ds.area.plot(label='Original');\n", - "ts.plot(label='Filtered');\n", + "ds.area.plot(label='Original')\n", + "ts.plot(label='Filtered')\n", "plt.legend();" ] }, @@ -229,8 +229,8 @@ "ts = ts.rolling(roll_yrs).min()\n", "ts.iloc[0:roll_yrs] = ts.iloc[roll_yrs]\n", "# Plot\n", - "ds.length.plot(label='Original');\n", - "ts.plot(label='Filtered');\n", + "ds.length.plot(label='Original')\n", + "ts.plot(label='Filtered')\n", "plt.legend();" ] }, diff --git a/notebooks/tutorials/building_the_prepro_gdirs.ipynb b/notebooks/tutorials/building_the_prepro_gdirs.ipynb index 18644ef0..738f6837 100644 --- a/notebooks/tutorials/building_the_prepro_gdirs.ipynb +++ b/notebooks/tutorials/building_the_prepro_gdirs.ipynb @@ -76,7 +76,7 @@ }, "outputs": [], "source": [ - "# we always need to initialzie and define a working directory\n", + "# we always need to initialise and define a working directory\n", "cfg.initialize(logging_level='WARNING')\n", "cfg.PATHS['working_dir'] = utils.gettempdir(dirname='OGGM-full_prepro_elevation_bands', reset=True)" ] @@ -104,7 +104,7 @@ }, "outputs": [], "source": [ - "# This section is only for future developments of the tutorial (e.g. updateing for new OGGM releases)\n", + "# This section is only for future developments of the tutorial (e.g. 
updating for new OGGM releases)\n", "# Test if prepro_base_url valid for both flowline_type_to_use, see level 2.\n", "# In total four complete executions of the notebook:\n", "# (load_from_prepro_base_url=False/True and flowline_type_to_use = 'elevation_band'/'centerline')\n", @@ -177,7 +177,7 @@ "# Instruction for beginning with existing OGGM's preprocessed directories\n", "if load_from_prepro_base_url:\n", " # to start from level 0 you can do\n", - " prepro_base_url_L0 = 'https://cluster.klima.uni-bremen.de/~oggm/gdirs/oggm_v1.6/L1-L2_files/elev_bands/'\n", + " prepro_base_url_L0 = 'https://cluster.klima.uni-bremen.de/~oggm/gdirs/oggm_v1.6/L1-L2_files/2025.6/elev_bands/'\n", " gdirs = workflow.init_glacier_directories(rgi_ids,\n", " from_prepro_level=0,\n", " prepro_base_url=prepro_base_url_L0,\n", @@ -255,7 +255,7 @@ "# Instruction for beginning with existing OGGM's preprocessed directories\n", "if load_from_prepro_base_url:\n", " # to start from level 1 you can do\n", - " prepro_base_url_L1 = 'https://cluster.klima.uni-bremen.de/~oggm/gdirs/oggm_v1.6/L1-L2_files/elev_bands/'\n", + " prepro_base_url_L1 = 'https://cluster.klima.uni-bremen.de/~oggm/gdirs/oggm_v1.6/L1-L2_files/2025.6/elev_bands/'\n", " gdirs = workflow.init_glacier_directories(rgi_ids,\n", " from_prepro_level=1,\n", " prepro_base_url=prepro_base_url_L1,\n", @@ -321,10 +321,10 @@ " ]\n", "\n", " for task in elevation_band_task_list:\n", - " workflow.execute_entity_task(task, gdirs);\n", + " workflow.execute_entity_task(task, gdirs)\n", "\n", "elif flowline_type_to_use == 'centerline':\n", - " # for centerline we can use parabola downstream line\n", + " # for centerlines we can use parabola downstream line\n", " cfg.PARAMS['downstream_line_shape'] = 'parabola'\n", "\n", " centerline_task_list = [\n", @@ -359,9 +359,9 @@ "if load_from_prepro_base_url:\n", " # to start from level 2 we need to distinguish between the flowline types\n", " if flowline_type_to_use == 'elevation_band':\n", - " 
prepro_base_url_L2 = 'https://cluster.klima.uni-bremen.de/~oggm/gdirs/oggm_v1.6/L1-L2_files/2023.2/elev_bands_w_data/'\n", + " prepro_base_url_L2 = 'https://cluster.klima.uni-bremen.de/~oggm/gdirs/oggm_v1.6/L1-L2_files/2025.6/elev_bands_w_data/'\n", " elif flowline_type_to_use == 'centerline':\n", - " prepro_base_url_L2 = 'https://cluster.klima.uni-bremen.de/~oggm/gdirs/oggm_v1.6/L1-L2_files/centerlines/'\n", + " prepro_base_url_L2 = 'https://cluster.klima.uni-bremen.de/~oggm/gdirs/oggm_v1.6/L1-L2_files/2025.6/centerlines/'\n", " else:\n", " raise ValueError(f\"Unknown flowline type '{flowline_type_to_use}'! Select 'elevation_band' or 'centerline'!\")\n", "\n", @@ -424,24 +424,24 @@ "cfg.PARAMS['baseline_climate'] = cfg.PARAMS['baseline_climate']\n", "\n", "# add climate data to gdir\n", - "workflow.execute_entity_task(tasks.process_climate_data, gdirs);\n", + "workflow.execute_entity_task(tasks.process_climate_data, gdirs)\n", "\n", "# the default mb calibration\n", "workflow.execute_entity_task(tasks.mb_calibration_from_geodetic_mb,\n", " gdirs,\n", " informed_threestep=True, # only available for 'GSWP3_W5E5'\n", - " );\n", + " )\n", "\n", "# glacier bed inversion\n", - "workflow.execute_entity_task(tasks.apparent_mb_from_any_mb, gdirs);\n", + "workflow.execute_entity_task(tasks.apparent_mb_from_any_mb, gdirs)\n", "workflow.calibrate_inversion_from_consensus(\n", " gdirs,\n", " apply_fs_on_mismatch=True,\n", - " error_on_mismatch=True, # if you running many glaciers some might not work\n", + " error_on_mismatch=True, # if you are running many glaciers some might not work\n", " filter_inversion_output=True, # this partly filters the overdeepening due to\n", " # the equilibrium assumption for retreating glaciers (see. Figure 5 of Maussion et al. 
2019)\n", " volume_m3_reference=None, # here you could provide your own total volume estimate in m3\n", - ");\n", + ")\n", "\n", "# finally create the dynamic flowlines\n", "workflow.execute_entity_task(tasks.init_present_time_glacier, gdirs);" @@ -501,9 +501,10 @@ "if load_from_prepro_base_url:\n", " # to start from level 3 you can do\n", " if flowline_type_to_use == 'elevation_band':\n", - " prepro_base_url_L3 = 'https://cluster.klima.uni-bremen.de/~oggm/gdirs/oggm_v1.6/L3-L5_files/2023.3/elev_bands/W5E5/'\n", + " prepro_base_url_L3 = 'https://cluster.klima.uni-bremen.de/~oggm/gdirs/oggm_v1.6/L3-L5_files/2025.6/elev_bands/W5E5/per_glacier/'\n", " elif flowline_type_to_use == 'centerline':\n", - " prepro_base_url_L3 = 'https://cluster.klima.uni-bremen.de/~oggm/gdirs/oggm_v1.6/L3-L5_files/2023.3/centerlines/W5E5/'\n", + " ###for centerlines, we only provide the spinup gdir for 2025.6. However, as we redo the steps, it does not matter\n", + " prepro_base_url_L3 = 'https://cluster.klima.uni-bremen.de/~oggm/gdirs/oggm_v1.6/L3-L5_files/2025.6/centerlines/W5E5/per_glacier_spinup/'\n", " else:\n", " raise ValueError(f\"Unknown flowline type '{flowline_type_to_use}'! 
Select 'elevation_band' or 'centerline'!\")\n", @@ -566,8 +567,8 @@ "minimise_for = 'area' # other option would be 'volume'\n", "workflow.execute_entity_task(\n", " tasks.run_dynamic_melt_f_calibration, gdirs,\n", - " err_dmdtda_scaling_factor=0.2, # by default we reduce the mass balance error for accounting for\n", - " # corrleated uncertainties on a regional scale\n", + " err_dmdtda_scaling_factor=0.2, # by default, we reduce the mass balance error to account for\n", + " # correlated uncertainties on a regional scale\n", " ys=dynamic_spinup_start_year, ye=ye,\n", " kwargs_run_function={'minimise_for': minimise_for},\n", " ignore_errors=True,\n", @@ -591,7 +592,8 @@ " if flowline_type_to_use == 'elevation_band':\n", " prepro_base_url_L4 = DEFAULT_BASE_URL\n", " elif flowline_type_to_use == 'centerline':\n", - " prepro_base_url_L4 = 'https://cluster.klima.uni-bremen.de/~oggm/gdirs/oggm_v1.6/L3-L5_files/2023.3/centerlines/W5E5/'\n", + " ###for centerlines, we only provide the spinup gdir for 2025.6. However, in most cases this is what you want to use anyways.\n", + " prepro_base_url_L4 = 'https://cluster.klima.uni-bremen.de/~oggm/gdirs/oggm_v1.6/L3-L5_files/2025.6/centerlines/W5E5/per_glacier_spinup/'\n", " else:\n", " raise ValueError(f\"Unknown flowline type '{flowline_type_to_use}'! Select 'elevation_band' or 'centerline'!\")\n", " gdirs = workflow.init_glacier_directories(rgi_ids,\n", @@ -661,7 +663,8 @@ " if flowline_type_to_use == 'elevation_band':\n", " prepro_base_url_L5 = DEFAULT_BASE_URL\n", " elif flowline_type_to_use == 'centerline':\n", - " prepro_base_url_L5 = 'https://cluster.klima.uni-bremen.de/~oggm/gdirs/oggm_v1.6/L3-L5_files/2023.3/centerlines/W5E5/'\n", + " ###for centerlines, we only provide the spinup gdir for 2025.6. 
However, in most cases this is what you want to use anyways.\n", + " prepro_base_url_L5 = 'https://cluster.klima.uni-bremen.de/~oggm/gdirs/oggm_v1.6/L3-L5_files/2025.6/centerlines/W5E5/per_glacier_spinup/'\n", " else:\n", " raise ValueError(f\"Unknown flowline type '{flowline_type_to_use}'! Select 'elevation_band' or 'centerline'!\")\n", " gdirs = workflow.init_glacier_directories(rgi_ids,\n", @@ -691,6 +694,14 @@ "- return to the [OGGM documentation](https://docs.oggm.org)\n", "- back to the [table of contents](../welcome.ipynb)" ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "8bf9e338d3864230", + "metadata": {}, + "outputs": [], + "source": [] } ], "metadata": { diff --git a/notebooks/tutorials/centerlines_to_shape.ipynb b/notebooks/tutorials/centerlines_to_shape.ipynb index c4fd8b57..bae2df04 100644 --- a/notebooks/tutorials/centerlines_to_shape.ipynb +++ b/notebooks/tutorials/centerlines_to_shape.ipynb @@ -70,7 +70,7 @@ "dem = rioxr.open_rasterio(fpath_dem)\n", "\n", "f, ax = plt.subplots(figsize=(9, 9))\n", - "dem.plot(ax=ax, cmap='terrain', vmin=0);\n", + "dem.plot(ax=ax, cmap='terrain', vmin=0)\n", "inventory.plot(ax=ax, edgecolor='k', facecolor='C1');" ] }, @@ -199,9 +199,9 @@ "source": [ "gdirs = workflow.init_glacier_directories(gdf)\n", "\n", - "workflow.execute_entity_task(tasks.define_glacier_region, gdirs, source='USER'); # Use the user DEM\n", + "workflow.execute_entity_task(tasks.define_glacier_region, gdirs, source='USER') # Use the user DEM\n", "\n", - "workflow.execute_entity_task(tasks.glacier_masks, gdirs);\n", + "workflow.execute_entity_task(tasks.glacier_masks, gdirs)\n", "workflow.execute_entity_task(tasks.compute_centerlines, gdirs);" ] }, @@ -268,7 +268,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "`LE_SEGMENT` is the length of the centerline in meters. The RGI \"IDs\" are fake (OGGM needs them) but the breID are real. 
Lets use them as index for the file:" + "`LE_SEGMENT` is the length of the centerline in meters. The RGI \"IDs\" are fake (OGGM needs them) but the breID are real. Let's use them as index for the file:" ] }, { @@ -297,7 +297,7 @@ "sel_breID = 1189 # 5570\n", "\n", "f, ax = plt.subplots(figsize=(9, 4))\n", - "orig_inventory.loc[[sel_breID]].plot(ax=ax, facecolor='lightblue');\n", + "orig_inventory.loc[[sel_breID]].plot(ax=ax, facecolor='lightblue')\n", "cls_default.loc[[sel_breID]].plot(ax=ax);" ] }, @@ -360,8 +360,8 @@ "sel_breID = 1189\n", "\n", "f, ax = plt.subplots(figsize=(9, 4))\n", - "orig_inventory.loc[[sel_breID]].plot(ax=ax, facecolor='lightblue');\n", - "cls_default.loc[[sel_breID]].plot(ax=ax, color='C0', alpha=0.5);\n", + "orig_inventory.loc[[sel_breID]].plot(ax=ax, facecolor='lightblue')\n", + "cls_default.loc[[sel_breID]].plot(ax=ax, color='C0', alpha=0.5)\n", "cls_smooth.loc[[sel_breID]].plot(ax=ax, color='C3');" ] }, @@ -376,7 +376,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "While the centerline algorithm is quite robust, the results will vary as a function of the resolution of the underlying grid, and the smoothing options. After trying a little, it seems difficult to find a setting which works \"best\" in all circumstances, and we encourage users to try several options and see what they prefer. The option likely to have the most impact (assuming smoothing with `(0.5, 5)` is the underlying grid resolution." + "While the centerline algorithm is quite robust, the results will vary as a function of the resolution of the underlying grid, and the smoothing options. After trying a little, it seems difficult to find a setting which works \"best\" in all circumstances, and we encourage users to try several options and see what they prefer. The option likely to have the most impact (assuming smoothing with `(0.5, 5)`) is the underlying grid resolution." 
] }, { @@ -388,6 +388,13 @@ "- return to the [OGGM documentation](https://docs.oggm.org)\n", "- back to the [table of contents](../welcome.ipynb)" ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] } ], "metadata": { diff --git a/notebooks/tutorials/deal_with_errors.ipynb b/notebooks/tutorials/deal_with_errors.ipynb index 0ce01446..c68ee968 100644 --- a/notebooks/tutorials/deal_with_errors.ipynb +++ b/notebooks/tutorials/deal_with_errors.ipynb @@ -13,7 +13,7 @@ "source": [ "In this example, we run the model on a list of three glaciers:\n", "two of them will end with errors: one because it already failed at\n", - "preprocessing (i.e. prior to this run), and one during the run. We show how to analyze theses erros and solve (some) of them, as described in the OGGM documentation under [troubleshooting](https://docs.oggm.org/en/stable/faq.html?highlight=border#troubleshooting)." + "preprocessing (i.e. prior to this run), and one during the run. We show how to analyze these errors and solve (some) of them, as described in the OGGM documentation under [troubleshooting](https://docs.oggm.org/en/stable/faq.html?highlight=border#troubleshooting)." 
] }, { @@ -53,7 +53,7 @@ "cfg.PARAMS['use_multiprocessing'] = True\n", "\n", "# This is the important bit!\n", - "# We tell OGGM to continue despite of errors\n", + "# We tell OGGM to continue despite errors\n", "cfg.PARAMS['continue_on_error'] = True\n", "\n", "# Local working directory (where OGGM will write its output)\n", @@ -91,7 +91,7 @@ "outputs": [], "source": [ "# Write the compiled output\n", - "utils.compile_glacier_statistics(gdirs); # saved as glacier_statistics.csv in the WORKING_DIR folder\n", + "utils.compile_glacier_statistics(gdirs) # saved as glacier_statistics.csv in the WORKING_DIR folder\n", "utils.compile_run_output(gdirs); # saved as run_output.nc in the WORKING_DIR folder" ] }, @@ -197,7 +197,7 @@ "gdirs = workflow.init_glacier_directories(rgi_ids, from_prepro_level=5, prepro_base_url=DEFAULT_BASE_URL)\n", "workflow.execute_entity_task(tasks.run_random_climate, gdirs, y0=2000,\n", " nyears=150, seed=0,\n", - " temperature_bias=-2);\n", + " temperature_bias=-2)\n", "\n", "# recompute the output\n", "# we can also get the run output directly from the methods\n", @@ -235,7 +235,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "This error message in the log is misleading: it does not really describe the source of the error, which happened earlier in the processing chain. Therefore we can look instead into the glacier_statistics via [compile_glacier_statistics](https://docs.oggm.org/en/stable/generated/oggm.utils.compile_glacier_statistics.html) or into the log output via [compile_task_log](https://docs.oggm.org/en/stable/generated/oggm.utils.compile_task_log.html#oggm.utils.compile_task_log):" + "This error message in the log is misleading: it does not really describe the source of the error, which happened earlier in the processing chain. 
Therefore, we can look instead into the glacier_statistics via [compile_glacier_statistics](https://docs.oggm.org/en/stable/generated/oggm.utils.compile_glacier_statistics.html) or into the log output via [compile_task_log](https://docs.oggm.org/en/stable/generated/oggm.utils.compile_task_log.html#oggm.utils.compile_task_log):" ] }, { diff --git a/notebooks/tutorials/dem_sources.ipynb b/notebooks/tutorials/dem_sources.ipynb index f2e461b7..b19d0280 100644 --- a/notebooks/tutorials/dem_sources.ipynb +++ b/notebooks/tutorials/dem_sources.ipynb @@ -72,7 +72,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "If not specifying anything, OGGM will use it's default settings, i.e. NASADEM for mid- and low-latitudes (60°S-60°N). However, this needs registration at [NASA Earthdata](https://urs.earthdata.nasa.gov/) (see \"Register\" below). Here, we choose the **SRTM** source as example DEM (no registration necessary)." + "If not specifying anything, OGGM will use its default settings, i.e. NASADEM for mid- and low-latitudes (60°S-60°N). However, this needs registration at [NASA Earthdata](https://urs.earthdata.nasa.gov/) (see \"Register\" below). Here, we choose the **SRTM** source as example DEM (no registration necessary)." ] }, { @@ -109,7 +109,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "It is a geotiff file. [Xarray](http://xarray.pydata.org) can open them thanks to [rasterio](https://rasterio.readthedocs.io):" + "It is a geotiff file. 
[Xarray](https://xarray.pydata.org) can open them thanks to [rasterio](https://rasterio.readthedocs.io):" ] }, { @@ -120,7 +120,7 @@ "source": [ "da = rioxr.open_rasterio(dem_path)\n", "f, ax = plt.subplots()\n", - "da.plot(cmap='terrain', ax=ax);\n", + "da.plot(cmap='terrain', ax=ax)\n", "# Add the outlines\n", "gdir.read_shapefile('outlines').plot(ax=ax, color='none', edgecolor='black');" ] @@ -146,7 +146,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "**OGGM is neither the owner nor the distributer of these datasets! OGGM only provides tools to access it. It is your responsibility as the data user to read the individual usage requirements and cite and acknowledge the original data sources accordingly.**" + "**OGGM is neither the owner nor the distributor of these datasets! OGGM only provides tools to access it. It is your responsibility as the data user to read the individual usage requirements and cite and acknowledge the original data sources accordingly.**" ] }, { @@ -194,7 +194,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "The [RGI-TOPO](https://rgitools.readthedocs.io/en/latest/dems.html) dataset is an RGI-provided dataset in beta release. These data are available for everyone, and were created with OGGM. Of course you can easily use these data in OGGM as well:" + "The [RGI-TOPO](https://rgitools.readthedocs.io/en/latest/dems.html) dataset is an RGI-provided dataset in beta release. These data are available for everyone, and were created with OGGM. 
Of course, you can easily use these data in OGGM as well:" ] }, { @@ -264,7 +264,7 @@ "source": [ "f, ax = plt.subplots()\n", "da_dem3 = rioxr.open_rasterio(gdir.get_filepath('dem'))\n", - "da_dem3.plot(cmap='terrain', ax=ax);\n", + "da_dem3.plot(cmap='terrain', ax=ax)\n", "gdir.read_shapefile('outlines').plot(ax=ax, color='none', edgecolor='black');" ] }, @@ -282,8 +282,8 @@ "outputs": [], "source": [ "f, ax = plt.subplots()\n", - "(da_dem3 - da).plot(ax=ax);\n", - "plt.title('DEM3 - SRTM');\n", + "(da_dem3 - da).plot(ax=ax)\n", + "plt.title('DEM3 - SRTM')\n", "gdir.read_shapefile('outlines').plot(ax=ax, color='none', edgecolor='black');" ] }, @@ -382,7 +382,7 @@ "source": [ "f, ax = plt.subplots()\n", "da_user = rioxr.open_rasterio(gdir.get_filepath('dem'))\n", - "da_user.plot(cmap='terrain', ax=ax);\n", + "da_user.plot(cmap='terrain', ax=ax)\n", "gdir.read_shapefile('outlines').plot(ax=ax, color='none', edgecolor='black');" ] }, @@ -390,7 +390,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## The border value, or how to chose the size of the topographic map" + "## The border value, or how to choose the size of the topographic map" ] }, { @@ -399,7 +399,7 @@ "source": [ "It is possible to specify the extent of the local topographic map. All maps are centered on the glacier and the size of the map is determined in grid points around the glacier. The number of grid points that was used in this example are 10 in order to save storage. But depending on your study you might need a larger topographic map. \n", "\n", - "OGGM's [pre-processed directories](https://docs.oggm.org/en/stable/input-data.html#pre-processed-directories) come in 4 border sizes: 10, 40, 80 and 160. But if you process the topography yourself you can chose every value." + "OGGM's [pre-processed directories](https://docs.oggm.org/en/stable/input-data.html#pre-processed-directories) come in 4 border sizes: 10, 40, 80 and 160. 
But if you process the topography yourself, you can choose any value." ] }, { @@ -433,7 +433,7 @@ "tasks.define_glacier_region(gdir)\n", "da = rioxr.open_rasterio(gdir.get_filepath('dem'))\n", "f, ax = plt.subplots()\n", - "da.plot(cmap='terrain', ax=ax);\n", + "da.plot(cmap='terrain', ax=ax)\n", "# Add the outlines\n", "gdir.read_shapefile('outlines').plot(ax=ax, color='none', edgecolor='black');" ] @@ -459,7 +459,7 @@ "tasks.define_glacier_region(gdir)\n", "da = rioxr.open_rasterio(gdir.get_filepath('dem'))\n", "f, ax = plt.subplots()\n", - "da.plot(cmap='terrain', ax=ax);\n", + "da.plot(cmap='terrain', ax=ax)\n", "# Add the outlines\n", "gdir.read_shapefile('outlines').plot(ax=ax, color='none', edgecolor='black');" ] @@ -473,6 +473,13 @@ "- return to the [OGGM documentation](https://docs.oggm.org)\n", "- back to the [table of contents](../welcome.ipynb)" ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } ], "metadata": { diff --git a/notebooks/tutorials/distribute_flowline.ipynb b/notebooks/tutorials/distribute_flowline.ipynb index dd6db05d..5ee1f10d 100644 --- a/notebooks/tutorials/distribute_flowline.ipynb +++ b/notebooks/tutorials/distribute_flowline.ipynb @@ -195,9 +195,9 @@ }, "outputs": [], "source": [ - "# Inititial glacier thickness\n", + "# Initial glacier thickness\n", "f, ax = plt.subplots()\n", - "ds.distributed_thickness.plot(ax=ax);\n", + "ds.distributed_thickness.plot(ax=ax)\n", "ax.axis('equal');" ] }, @@ -212,8 +212,8 @@ "source": [ "# Which points belongs to which band, and then within one band which are the first to melt\n", "f, (ax1, ax2) = plt.subplots(1, 2, figsize=(12, 4))\n", - "ds.band_index.plot.contourf(ax=ax1);\n", - "ds.rank_per_band.plot(ax=ax2);\n", + "ds.band_index.plot.contourf(ax=ax1)\n", + "ds.rank_per_band.plot(ax=ax2)\n", "ax1.axis('equal'); ax2.axis('equal'); plt.tight_layout();" ] }, @@ -294,10 +294,10 @@ "source": [ "def plot_distributed_thickness(ds, 
title):\n", " f, (ax1, ax2, ax3) = plt.subplots(1, 3, figsize=(14, 4))\n", - " ds.simulated_thickness.sel(time=2005).plot(ax=ax1, vmax=400);\n", - " ds.simulated_thickness.sel(time=2050).plot(ax=ax2, vmax=400);\n", - " ds.simulated_thickness.sel(time=2100).plot(ax=ax3, vmax=400);\n", - " ax1.axis('equal'); ax2.axis('equal'); f.suptitle(title, fontsize=20);\n", + " ds.simulated_thickness.sel(time=2005).plot(ax=ax1, vmax=400)\n", + " ds.simulated_thickness.sel(time=2050).plot(ax=ax2, vmax=400)\n", + " ds.simulated_thickness.sel(time=2100).plot(ax=ax3, vmax=400)\n", + " ax1.axis('equal'); ax2.axis('equal'); f.suptitle(title, fontsize=20)\n", " plt.tight_layout();\n", "\n", "plot_distributed_thickness(ds[0], 'Aletsch')\n", @@ -323,8 +323,8 @@ "source": [ "def plot_area(ds, gdir, title):\n", " area = (ds.simulated_thickness > 0).sum(dim=['x', 'y']) * gdir.grid.dx**2 * 1e-6\n", - " area.plot(label='Distributed area');\n", - " plt.hlines(gdir.rgi_area_km2, gdir.rgi_date, 2100, color='C3', linestyles='--', label='RGI Area');\n", + " area.plot(label='Distributed area')\n", + " plt.hlines(gdir.rgi_area_km2, gdir.rgi_date, 2100, color='C3', linestyles='--', label='RGI Area')\n", " plt.legend(loc='lower left'); plt.ylabel('Area [km2]'); plt.title(title, fontsize=20); plt.show();\n", "\n", "\n", @@ -351,7 +351,7 @@ "source": [ "def plot_volume(ds, gdir, title):\n", " vol = ds.simulated_thickness.sum(dim=['x', 'y']) * gdir.grid.dx**2 * 1e-9\n", - " vol.plot(label='Distributed volume'); plt.ylabel('Distributed volume [km3]');\n", + " vol.plot(label='Distributed volume'); plt.ylabel('Distributed volume [km3]')\n", " plt.title(title, fontsize=20); plt.show();\n", "\n", "\n", @@ -401,7 +401,7 @@ "outputs": [], "source": [ "from matplotlib import animation\n", - "from IPython.display import HTML, display\n", + "from IPython.display import HTML\n", "\n", "# Get a handle on the figure and the axes\n", "fig, ax = plt.subplots()\n", @@ -492,7 +492,7 @@ " 
output_filesuffix='_random_s1_smooth', # do not overwrite the previous file (optional) \n", " # add_monthly=True, # more frames! (12 times more - we comment for the demo, but recommend it)\n", " rolling_mean_smoothing=7, # smooth the area time series\n", - " fl_thickness_threshold=1, # avoid snow patches to be nisclassified\n", + " fl_thickness_threshold=1, # avoid snow patches being misclassified\n", " )" ] }, @@ -507,10 +507,10 @@ "source": [ "def plot_area_smoothed(ds_smooth, ds, gdir, title):\n", " area = (ds.simulated_thickness > 0).sum(dim=['x', 'y']) * gdir.grid.dx**2 * 1e-6\n", - " area.plot(label='Distributed area (raw)');\n", + " area.plot(label='Distributed area (raw)')\n", " area = (ds_smooth.simulated_thickness > 0).sum(dim=['x', 'y']) * gdir.grid.dx**2 * 1e-6\n", - " area.plot(label='Distributed area (smooth)');\n", - " plt.legend(loc='lower left'); plt.ylabel('Area [km2]');\n", + " area.plot(label='Distributed area (smooth)')\n", + " plt.legend(loc='lower left'); plt.ylabel('Area [km2]')\n", " plt.title(title, fontsize=20); plt.show();\n", "\n", "\n", @@ -605,8 +605,8 @@ "distribute_2d.merge_simulated_thickness(\n", " gdirs, # the gdirs we want to merge\n", " simulation_filesuffix=simulation_filesuffix, # the name of the simulation\n", - " years_to_merge=np.arange(2005, 2101, 5), # for demonstration I only pick some years, if this is None all years are merged\n", - " add_topography=True, # if you do not need topogrpahy setting this to False will decrease computing time\n", + " years_to_merge=np.arange(2005, 2101, 5), # for demonstration, I only pick some years; if this is None, all years are merged\n", + " add_topography=True, # if you do not need topography, setting this to False will decrease computing time\n", " preserve_totals=True, # preserve individual glacier volumes during merging\n", " reset=True,\n", ")" ] @@ -819,6 +819,14 @@ "source": [ "The WIP tool is available here: https://github.com/OGGM/oggm-3dviz" ] + }, + { + "cell_type": "code", + 
"execution_count": null, + "id": "9c7a9237721c1d87", + "metadata": {}, + "outputs": [], + "source": [] } ], "metadata": { diff --git a/notebooks/tutorials/dynamical_spinup.ipynb b/notebooks/tutorials/dynamical_spinup.ipynb index 95d91fb1..337900a1 100644 --- a/notebooks/tutorials/dynamical_spinup.ipynb +++ b/notebooks/tutorials/dynamical_spinup.ipynb @@ -19,12 +19,12 @@ "\n", "However, running simulations in the recent past can be quite useful for model validation. Also, more direct observations are available of glacier states of the recent past for constriction (e.g. area, geodetic mass balance). Further, a dynamical initialisation can release strong assumptions in the OGGM default settings: first that glaciers are in dynamical equilibrium at the glacier outline date (an assumption required for the ice thickness inversion) and second that the mass balance melt factor parameter (*melt_f*) is calibrated towards a geodetic mass balance ignoring a dynamically changing glacier geometry.\n", "\n", - "In recent PRs ([GH1342](https://github.com/OGGM/oggm/pull/1342), [GH1232](https://github.com/OGGM/oggm/pull/1232), [GH1361](https://github.com/OGGM/oggm/pull/1361) and [GH1425](https://github.com/OGGM/oggm/pull/1425)) we have released two new run tasks in OGGM which help with this issues:\n", + "In recent PRs ([GH1342](https://github.com/OGGM/oggm/pull/1342), [GH1232](https://github.com/OGGM/oggm/pull/1232), [GH1361](https://github.com/OGGM/oggm/pull/1361) and [GH1425](https://github.com/OGGM/oggm/pull/1425)) we have released two new run tasks in OGGM which help with this issue:\n", "\n", "- The ```run_dynamic_spinup``` task, by default, aims to find a glacier state before the RGI-date (~10-30 years back) from which the glacier evolves to match the area given by the RGI-outline. 
Alternatively, it is also possible to use this task to match an observed volume.\n", "- The ```run_dynamic_melt_f_calibration``` task iteratively searches for a *melt_f* to match the observed geodetic mass balance taking a dynamically changing glacier geometry into account.\n", "\n", - "And of course, we want to match both things in the same past model run. Therefore by default in each iteration of ```run_dynamic_melt_f_calibration``` the ```run_dynamic_spinup``` function is included. A more in-depth explanation of the two tasks is provided in the next two chapters, which are followed by an example and a comparison of the different spinup options. \n", + "And of course, we want to match both things in the same past model run. Therefore, by default in each iteration of ```run_dynamic_melt_f_calibration``` the ```run_dynamic_spinup``` function is included. A more in-depth explanation of the two tasks is provided in the next two chapters, which are followed by an example and a comparison of the different spinup options.\n", "\n", "## High-level explanation of ```run_dynamic_spinup```:\n", "\n", @@ -37,7 +37,7 @@ "- Get the model area or volume of the glacier at *t_end*.\n", "- Compare the model value to the reference value we want to meet.\n", "- If the difference is inside a given precision, stop the procedure and save the glacier evolution of this run.\n", - "- If the difference is outside a given precision, change the temperature bias for mb_spinup and start over again (how the next guess is found is descript [here](#The-minimization-algorithm:)).\n", + "- If the difference is outside a given precision, change the temperature bias for mb_spinup and start over again (how the next guess is found is described [here](#the-minimization-algorithm)).\n", "\n", "With OGGM version 1.6.1 it is now also possible to provide a custom target year ```target_yr``` with target value ```target_value``` to match. 
This could be useful if you have more data available than on a global scale (e.g. an extra outline at a later date than the RGI).\n", "\n", @@ -54,7 +54,7 @@ "\n", "## High-level explanation of ```run_dynamic_melt_f_calibration ```:\n", "\n", - "This task iteratively searches for a *melt_f* to match a given geodetic mass balance incorporating a dynamic model run. But changing *melt_f* means we need to rerun all model setup steps which incorporate the mass-balance, to have one consistent model initialisation chain. In particular, we need to conduct the bed inversion again (the mass-balance is used in the flux calculation, see [here](https://docs.oggm.org/en/latest/inversion.html#ice-flux)). Therefore one default iteration of the dynamic *melt_f* calibration looks like this:\n", + "This task iteratively searches for a *melt_f* to match a given geodetic mass balance incorporating a dynamic model run. But changing *melt_f* means we need to rerun all model setup steps which incorporate the mass-balance, to have one consistent model initialisation chain. In particular, we need to conduct the bed inversion again (the mass-balance is used in the flux calculation, see [here](https://docs.oggm.org/en/latest/inversion.html#ice-flux)). Therefore, one default iteration of the dynamic *melt_f* calibration looks like this:\n", "\n", "- define a new *melt_f* in the glacier directory\n", "- conduct an inversion which calibrates to the consensus volume, still assuming dynamic equilibrium. (a little tricky: by default, before we start with the dynamic *melt_f* calibration the first inversion with calibration on a regional scale was already carried out -> the individual glaciers do not match the consensus volume exactly, but when adding all glaciers of one region the consensus volume is matched. 
Therefore during this task, the individual glacier is again matched to the volume of the regional assessment and not on an individual basis.)\n", @@ -62,11 +62,11 @@ "- calculate the modelled geodetic mass balance\n", "- calculate the difference between modelled value and observation\n", "- if the difference is inside the observation uncertainty stop and save the model run (indicated in diagnostics with ```used_spinup_option = dynamic melt_f calibration (full success)```)\n", - "- if the difference is larger, define a new *melt_f* and start over again (how the next guess is found is described [here](#The-minimization-algorithm:))\n", + "- if the difference is larger, define a new *melt_f* and start over again (how the next guess is found is described [here](#the-minimization-algorithm))\n", "\n", "If the iterative search is not successful and ```ignore_errors = True``` there are several possible outcomes:\n", "\n", - "- First, it is checked if there were some successful runs which improved the mismatch. If so, the best run is saved and it is indicated in the diagnostics with ```used_spinup_option = dynamic melt_f calibration (part success)```\n", + "- First, it is checked if there were some successful runs which improved the mismatch. If so, the best run is saved, and it is indicated in the diagnostics with ```used_spinup_option = dynamic melt_f calibration (part success)```\n", "- If only the first guess worked this run is saved and indicated in the diagnostics with ```used_spinup_option = dynamic spinup only```\n", "- And if everything failed a fixed geometry spinup is conducted and indicated in the diagnostics with ```used_spinup_option = fixed geometry spinup```\n", "\n", @@ -74,7 +74,7 @@ "\n", "## The minimization algorithm:\n", "\n", - "To start, a first guess of the control variable (temperature bias or *melt_f*) is used and evaluated. If by chance, the mismatch between model and observation is close enough, the algorithm stops already. 
Otherwise, the second guess depends on the calculated first guess mismatch. For example, if the first resulting area is smaller (larger) than the searched one, the second temperature bias will be colder (warmer). Because a colder (warmer) temperature leads to a larger (smaller) initial glacier state at *t_start*. If the second guess is still unsuccessful, for all consecutive guesses the previous value pairs (control variable, mismatch) are used to determine the next guess. For this, a stepwise linear function is fitted to these pairs and afterwards, the mismatch is set to 0 to get the following guess (this method is similar to the one described in Zekollari et al. 2019 Appendix A). Moreover, a maximum step length between two guesses is defined as too large step-sizes could easily lead to failing model runs (e.g. see [here](#Two-main-problems-why-the-dynamic-spinup-could-not-work:)). Further, the algorithm was adapted independently inside ```run_dynamic_spinup``` and ```run_dynamic_melt_f_calibration``` to cope with failing model runs individually. Note that this minimization algorithm only works if the underlying relationship between the control variable and the mismatch is strictly monotone.\n", + "To start, a first guess of the control variable (temperature bias or *melt_f*) is used and evaluated. If by chance, the mismatch between model and observation is close enough, the algorithm stops already. Otherwise, the second guess depends on the calculated first guess mismatch. For example, if the first resulting area is smaller (larger) than the searched one, the second temperature bias will be colder (warmer). Because a colder (warmer) temperature leads to a larger (smaller) initial glacier state at *t_start*. If the second guess is still unsuccessful, for all consecutive guesses the previous value pairs (control variable, mismatch) are used to determine the next guess. 
For this, a stepwise linear function is fitted to these pairs and afterwards, the mismatch is set to 0 to get the following guess (this method is similar to the one described in Zekollari et al. 2019 Appendix A). Moreover, a maximum step length between two guesses is defined, as too-large step sizes could easily lead to failing model runs (e.g. see [here](#two-main-problems-why-the-dynamic-spinup-could-not-work)). Further, the algorithm was adapted independently inside ```run_dynamic_spinup``` and ```run_dynamic_melt_f_calibration``` to cope with failing model runs individually. Note that this minimization algorithm only works if the underlying relationship between the control variable and the mismatch is strictly monotone.\n", "\n", "If someone is interested in how this algorithm works in more detail, here is a conceptual code snippet:" ] @@ -89,7 +89,6 @@ }, "outputs": [], "source": [ - "import numpy as np\n", "from scipy import interpolate\n", "\n", "def minimisation_algorithm(\n", @@ -168,7 +167,6 @@ "import matplotlib.pyplot as plt\n", "import xarray as xr\n", "import numpy as np\n", - "import pandas as pd\n", "import seaborn as sns" ] }, @@ -189,7 +187,7 @@ "metadata": {}, "outputs": [], "source": [ - "from oggm import cfg, utils, workflow, tasks, graphics" + "from oggm import cfg, utils, workflow, tasks" ] }, { @@ -239,9 +237,9 @@ "metadata": {}, "outputs": [], "source": [ - "# We use a recent gdir setting, calibated on a glacier per glacier basis\n", + "# We use a recent gdir setting, calibrated on a glacier per glacier basis\n", "base_url = ('https://cluster.klima.uni-bremen.de/~oggm/gdirs/oggm_v1.6/'\n", - " 'L3-L5_files/2023.3/elev_bands/W5E5/')" + " 'L3-L5_files/2025.6/elev_bands/W5E5/per_glacier')" ] }, { @@ -251,7 +249,8 @@ "outputs": [], "source": [ "# We use a relatively large border value to allow the glacier to grow during spinup\n", - "gdirs = workflow.init_glacier_directories(rgi_ids, from_prepro_level=3, prepro_border=160, 
prepro_base_url=base_url)" + "gdirs = workflow.init_glacier_directories(rgi_ids, from_prepro_level=3, prepro_border=80,\n", + " prepro_base_url=base_url)" ] }, { @@ -308,9 +307,9 @@ "source": [ "# ---- First do the fixed geometry spinup ----\n", "tasks.run_from_climate_data(gdir,\n", - " fixed_geometry_spinup_yr=spinup_start_yr, # Start the run at the RGI date but retro-actively correct the data with fixed geometry\n", + " fixed_geometry_spinup_yr=spinup_start_yr, # Start the run at the RGI date but retroactively correct the data with fixed geometry\n", " output_filesuffix='_hist_fixed_geom', # where to write the output\n", - " );\n", + " )\n", "# Read the output\n", "with xr.open_dataset(gdir.get_filepath('model_diagnostics', filesuffix='_hist_fixed_geom')) as ds:\n", " ds_hist = ds.load()\n", @@ -318,10 +317,10 @@ "# ---- Second the dynamic spinup alone, matching area ----\n", "tasks.run_dynamic_spinup(gdir,\n", " spinup_start_yr=spinup_start_yr, # When to start the spinup\n", - " minimise_for='area', # what target to match at the RGI date\n", + " minimise_for='area', # which target to match at the RGI date\n", " output_filesuffix='_spinup_dynamic_area', # Where to write the output\n", " ye=2020, # When the simulation should stop\n", - " );\n", + " )\n", "# Read the output\n", "with xr.open_dataset(gdir.get_filepath('model_diagnostics', filesuffix='_spinup_dynamic_area')) as ds:\n", " ds_dynamic_spinup_area = ds.load()\n", @@ -329,10 +328,10 @@ "# ---- Third the dynamic spinup alone, matching volume ----\n", "tasks.run_dynamic_spinup(gdir,\n", " spinup_start_yr=spinup_start_yr, # When to start the spinup\n", - " minimise_for='volume', # what target to match at the RGI date\n", + " minimise_for='volume', # which target to match at the RGI date\n", " output_filesuffix='_spinup_dynamic_volume', # Where to write the output\n", " ye=2020, # When the simulation should stop\n", - " );\n", + " )\n", "# Read the output\n", "with 
xr.open_dataset(gdir.get_filepath('model_diagnostics', filesuffix='_spinup_dynamic_volume')) as ds:\n", " ds_dynamic_spinup_volume = ds.load()\n", @@ -342,7 +341,7 @@ " ys=spinup_start_yr, # When to start the spinup\n", " ye=2020, # When the simulation should stop\n", " output_filesuffix='_dynamic_melt_f', # Where to write the output\n", - " );\n", + " )\n", "\n", "with xr.open_dataset(gdir.get_filepath('model_diagnostics', filesuffix='_dynamic_melt_f')) as ds:\n", " ds_dynamic_melt_f = ds.load()" @@ -354,37 +353,37 @@ "metadata": {}, "outputs": [], "source": [ - "# Now make a plot for comparision\n", + "# Now make a plot for comparison\n", "y0 = gdir.rgi_date + 1\n", "\n", "f, (ax1, ax2, ax3) = plt.subplots(1, 3, figsize=(16, 5))\n", "\n", - "ds_hist.volume_m3.plot(ax=ax1, label='Fixed geometry spinup');\n", - "ds_dynamic_melt_f.volume_m3.plot(ax=ax1, label='Dynamical melt_f calibration');\n", - "ds_dynamic_spinup_area.volume_m3.plot(ax=ax1, label='Dynamical spinup match area');\n", - "ds_dynamic_spinup_volume.volume_m3.plot(ax=ax1, label='Dynamical spinup match volume');\n", - "ax1.set_title('Volume');\n", - "ax1.scatter(y0, volume_reference, c='C3', label='Reference values');\n", - "ax1.legend();\n", - "\n", - "ds_hist.area_m2.plot(ax=ax2);\n", - "ds_dynamic_melt_f.area_m2.plot(ax=ax2);\n", - "ds_dynamic_spinup_area.area_m2.plot(ax=ax2);\n", - "ds_dynamic_spinup_volume.area_m2.plot(ax=ax2);\n", - "ax2.set_title('Area');\n", + "ds_hist.volume_m3.plot(ax=ax1, label='Fixed geometry spinup')\n", + "ds_dynamic_melt_f.volume_m3.plot(ax=ax1, label='Dynamical melt_f calibration')\n", + "ds_dynamic_spinup_area.volume_m3.plot(ax=ax1, label='Dynamical spinup match area')\n", + "ds_dynamic_spinup_volume.volume_m3.plot(ax=ax1, label='Dynamical spinup match volume')\n", + "ax1.set_title('Volume')\n", + "ax1.scatter(y0, volume_reference, c='C3', label='Reference values')\n", + "ax1.legend()\n", + "\n", + "ds_hist.area_m2.plot(ax=ax2)\n", + 
"ds_dynamic_melt_f.area_m2.plot(ax=ax2)\n", + "ds_dynamic_spinup_area.area_m2.plot(ax=ax2)\n", + "ds_dynamic_spinup_volume.area_m2.plot(ax=ax2)\n", + "ax2.set_title('Area')\n", "ax2.scatter(y0, area_reference, c='C3')\n", "\n", - "ds_hist.length_m.plot(ax=ax3);\n", - "ds_dynamic_melt_f.length_m.plot(ax=ax3);\n", - "ds_dynamic_spinup_area.length_m.plot(ax=ax3);\n", - "ds_dynamic_spinup_volume.length_m.plot(ax=ax3);\n", - "ax3.set_title('Length');\n", + "ds_hist.length_m.plot(ax=ax3)\n", + "ds_dynamic_melt_f.length_m.plot(ax=ax3)\n", + "ds_dynamic_spinup_area.length_m.plot(ax=ax3)\n", + "ds_dynamic_spinup_volume.length_m.plot(ax=ax3)\n", + "ax3.set_title('Length')\n", "ax3.scatter(y0, ds_hist.sel(time=y0).length_m, c='C3')\n", "\n", "plt.tight_layout()\n", - "plt.show();\n", + "plt.show()\n", "\n", - "# and print out the modeled geodetic mass balances for comparision\n", + "# and print out the modeled geodetic mass balances for comparison\n", "def get_dmdtda(ds):\n", " yr0_ref_mb, yr1_ref_mb = ref_period.split('_')\n", " yr0_ref_mb = int(yr0_ref_mb.split('-')[0])\n", @@ -423,7 +422,7 @@ "outputs": [], "source": [ "# define an artificial error for dmdtda\n", - "dmdtda_reference_error_artificial = 10 # error must be given as a positive number\n", + "dmdtda_reference_error_artificial = 30 # error must be given as a positive number\n", "\n", "tasks.run_dynamic_melt_f_calibration(gdir,\n", " ys=spinup_start_yr, # When to start the spinup\n", @@ -431,7 +430,7 @@ " output_filesuffix='_dynamic_melt_f_artificial', # Where to write the output\n", " ref_dmdtda=dmdtda_reference, # user-provided geodetic mass balance observation\n", " err_ref_dmdtda=dmdtda_reference_error_artificial, # uncertainty of user-provided geodetic mass balance observation \n", - " );\n", + " )\n", "\n", "with xr.open_dataset(gdir.get_filepath('model_diagnostics', filesuffix='_dynamic_melt_f_artificial')) as ds:\n", " ds_dynamic_melt_f_artificial = ds.load()" @@ -448,32 +447,32 @@ "\n", "f, (ax1, ax2, 
ax3) = plt.subplots(1, 3, figsize=(16, 5))\n", "\n", - "ds_hist.volume_m3.plot(ax=ax1, label='Fixed geometry spinup');\n", - "ds_dynamic_melt_f.volume_m3.plot(ax=ax1, label='Dynamical melt_f calibration (original error)');\n", - "ds_dynamic_melt_f_artificial.volume_m3.plot(ax=ax1, label='Dynamical melt_f calibration (aritificial error)');\n", - "ds_dynamic_spinup_area.volume_m3.plot(ax=ax1, label='Dynamical spinup match area');\n", - "ax1.set_title('Volume');\n", - "ax1.scatter(y0, volume_reference, c='C3', label='Reference values');\n", - "ax1.legend();\n", - "\n", - "ds_hist.area_m2.plot(ax=ax2);\n", - "ds_dynamic_melt_f.area_m2.plot(ax=ax2);\n", - "ds_dynamic_melt_f_artificial.area_m2.plot(ax=ax2);\n", - "ds_dynamic_spinup_area.area_m2.plot(ax=ax2);\n", - "ax2.set_title('Area');\n", + "ds_hist.volume_m3.plot(ax=ax1, label='Fixed geometry spinup')\n", + "ds_dynamic_melt_f.volume_m3.plot(ax=ax1, label='Dynamical melt_f calibration (original error)')\n", + "ds_dynamic_melt_f_artificial.volume_m3.plot(ax=ax1, label='Dynamical melt_f calibration (artificial error)')\n", + "ds_dynamic_spinup_area.volume_m3.plot(ax=ax1, label='Dynamical spinup match area')\n", + "ax1.set_title('Volume')\n", + "ax1.scatter(y0, volume_reference, c='C3', label='Reference values')\n", + "ax1.legend()\n", + "\n", + "ds_hist.area_m2.plot(ax=ax2)\n", + "ds_dynamic_melt_f.area_m2.plot(ax=ax2)\n", + "ds_dynamic_melt_f_artificial.area_m2.plot(ax=ax2)\n", + "ds_dynamic_spinup_area.area_m2.plot(ax=ax2)\n", + "ax2.set_title('Area')\n", "ax2.scatter(y0, area_reference, c='C3')\n", "\n", - "ds_hist.length_m.plot(ax=ax3);\n", - "ds_dynamic_melt_f.length_m.plot(ax=ax3);\n", - "ds_dynamic_melt_f_artificial.length_m.plot(ax=ax3);\n", - "ds_dynamic_spinup_area.length_m.plot(ax=ax3);\n", - "ax3.set_title('Length');\n", + "ds_hist.length_m.plot(ax=ax3)\n", + "ds_dynamic_melt_f.length_m.plot(ax=ax3)\n", + "ds_dynamic_melt_f_artificial.length_m.plot(ax=ax3)\n", + 
"ds_dynamic_spinup_area.length_m.plot(ax=ax3)\n", + "ax3.set_title('Length')\n", "ax3.scatter(y0, ds_hist.sel(time=y0).length_m, c='C3')\n", "\n", "plt.tight_layout()\n", "plt.show();\n", "\n", - "# and print out the modeled geodetic mass balances for comparision\n", + "# and print out the modeled geodetic mass balances for comparison\n", "def get_dmdtda(ds):\n", " yr0_ref_mb, yr1_ref_mb = ref_period.split('_')\n", " yr0_ref_mb = int(yr0_ref_mb.split('-')[0])\n", @@ -556,7 +555,7 @@ "source": [ "# ---- First do the fixed geometry spinup ----\n", "tasks.run_from_climate_data(gdir,\n", - " fixed_geometry_spinup_yr=spinup_start_yr, # Start the run at the RGI date but retro-actively correct the data with fixed geometry\n", + " fixed_geometry_spinup_yr=spinup_start_yr, # Start the run at the RGI date but retroactively correct the data with fixed geometry\n", " output_filesuffix='_hist_fixed_geom', # where to write the output\n", " );\n", "# Read the output\n", @@ -567,7 +566,7 @@ "tasks.run_dynamic_spinup(gdir,\n", " precision_percent=3, # For this glacier we only try to be within 3% or RGI_area\n", " spinup_start_yr=spinup_start_yr, # When to start the spinup\n", - " minimise_for='area', # what target to match at the RGI date\n", + " minimise_for='area', # which target to match at the RGI date\n", " output_filesuffix='_spinup_dynamic_area', # Where to write the output\n", " ye=2020, # When the simulation should stop\n", " );\n", @@ -578,7 +577,7 @@ "# ---- Third the dynamic spinup alone, matching volume ----\n", "tasks.run_dynamic_spinup(gdir,\n", " spinup_start_yr=spinup_start_yr, # When to start the spinup\n", - " minimise_for='volume', # what target to match at the RGI date\n", + " minimise_for='volume', # which target to match at the RGI date\n", " output_filesuffix='_spinup_dynamic_volume', # Where to write the output\n", " ye=2020, # When the simulation should stop\n", " );\n", @@ -594,7 +593,7 @@ "outputs": [], "source": [ "def 
plot_dynamic_spinup_bad_glacier():\n", - " # Now make a plot for comparision\n", + " # Now make a plot for comparison\n", " y0 = gdir.rgi_date + 1\n", "\n", " f, (ax1, ax2, ax3) = plt.subplots(1, 3, figsize=(16, 5))\n", @@ -621,7 +620,7 @@ " plt.tight_layout()\n", " plt.show();\n", "\n", - " # and print out the modeled geodetic mass balances for comparision\n", + " # and print out the modeled geodetic mass balances for comparison\n", " def get_dmdtda(ds):\n", " yr0_ref_mb, yr1_ref_mb = ref_period.split('_')\n", " yr0_ref_mb = int(yr0_ref_mb.split('-')[0])\n", @@ -640,9 +639,9 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "In this example, you see that the dynamic spinup run matching area does not start in 1980. The reason for this are the two main problems and the coping strategy of reducing the spinup time, described [here](#Two-main-problems-why-the-dynamic-spinup-could-not-work:) in more detail.\n", + "In this example, you see that the dynamic spinup run matching area does not start in 1980. 
The reasons for this are the two main problems and the coping strategy of reducing the spinup time, described [here](#two-main-problems-why-the-dynamic-spinup-could-not-work) in more detail.\n", "\n", - "To get an glacier evolution starting at 1980 you can use ```add_fixed_geometry_spinup = True```:" + "To get a glacier evolution starting at 1980 you can use ```add_fixed_geometry_spinup = True```:" ] }, { @@ -654,10 +653,10 @@ "tasks.run_dynamic_spinup(gdir,\n", " precision_percent=3, # For this glacier we only try to be within 3% of RGI_area\n", " spinup_start_yr=spinup_start_yr, # When to start the spinup\n", - " minimise_for='area', # what target to match at the RGI date\n", + " minimise_for='area', # which target to match at the RGI date\n", " output_filesuffix='_spinup_dynamic_area', # Where to write the output\n", " ye=2020, # When the simulation should stop\n", - " add_fixed_geometry_spinup=True, # add a fixed geometry spinup if period needs to be shortent\n", + " add_fixed_geometry_spinup=True, # add a fixed geometry spinup if the period needs to be shortened\n", " );\n", "\n", "with xr.open_dataset(gdir.get_filepath('model_diagnostics', filesuffix='_spinup_dynamic_area')) as ds:\n", @@ -670,7 +669,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "You see that a fixed geomtery was added to the dynamical spinup matching the area (orange curve). With this you can be sure that all glaciers start at the same year." + "You see that a fixed geometry was added to the dynamical spinup matching the area (orange curve). With this you can be sure that all glaciers start at the same year." 
] }, { @@ -706,13 +705,13 @@ " store_monthly_hydro=True, # compute monthly hydro diagnostics\n", " ref_area_from_y0=True, # Even if the glacier may grow, keep the reference area as the year 0 of the simulation\n", " output_filesuffix='_spinup_dynamic_hydro', # Where to write the output - this is needed to stitch the runs together afterwards\n", - " );\n", + " )\n", "\n", "# Read the output\n", "with xr.open_dataset(gdir.get_filepath('model_diagnostics', filesuffix='_spinup_dynamic_hydro')) as ds:\n", " ds_dynamic_spinup_hydro = ds.load()\n", "\n", - "ds_dynamic_spinup_hydro = ds_dynamic_spinup_hydro.isel(time=slice(0, -1)) # The last timestep is incomplete for hydro (not started)" + "ds_dynamic_spinup_hydro = ds_dynamic_spinup_hydro.isel(time=slice(0, -1)); # The last timestep is incomplete for hydro (not started)" ] }, { @@ -765,7 +764,7 @@ "It consists of the following components:\n", "- melt off-glacier: snow melt on areas that are now glacier free (i.e. 0 in the year of largest glacier extent, in this example at the start of the simulation)\n", "- melt on-glacier: ice + seasonal snow melt on the glacier\n", - "- liquid precipitaton on- and off-glacier (the latter being zero at the year of largest glacial extent, in this example at start of the simulation)" + "- liquid precipitation on- and off-glacier (the latter being zero at the year of largest glacial extent, in this example at start of the simulation)" ] }, { @@ -774,7 +773,7 @@ "metadata": {}, "outputs": [], "source": [ - "f, ax = plt.subplots(figsize=(10, 6));\n", + "f, ax = plt.subplots(figsize=(10, 6))\n", "df_runoff.plot.area(ax=ax, color=sns.color_palette(\"rocket\")); plt.xlabel('Years'); plt.ylabel('Runoff (Mt)'); plt.title(rgi_ids[0]);" ] }, @@ -812,6 +811,11 @@ "monthly_runoff.clip(0).plot(cmap='Blues', cbar_kwargs={'label':'Mt'}); plt.xlabel('Months'); plt.ylabel('Years'); plt.title(rgi_ids[0]);" ] }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [] + }, { "cell_type": "markdown", 
"metadata": {}, @@ -826,8 +830,15 @@ "## What's next?\n", "\n", "- return to the [OGGM documentation](https://docs.oggm.org)\n", - "- back to the [table of contents](welcome.ipynb)" + "- back to the [table of contents](../welcome.ipynb)" ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] } ], "metadata": { diff --git a/notebooks/tutorials/elevation_bands_vs_centerlines.ipynb b/notebooks/tutorials/elevation_bands_vs_centerlines.ipynb index 0063670c..4faa8968 100644 --- a/notebooks/tutorials/elevation_bands_vs_centerlines.ipynb +++ b/notebooks/tutorials/elevation_bands_vs_centerlines.ipynb @@ -90,7 +90,7 @@ "cfg.PATHS['working_dir'] = utils.gettempdir(dirname='OGGM-centerlines', reset=True)\n", "\n", "# We start from prepro level 3 with all data ready - note the url here\n", - "base_url = 'https://cluster.klima.uni-bremen.de/~oggm/gdirs/oggm_v1.6/L3-L5_files/2023.3/centerlines/W5E5/'\n", + "base_url = 'https://cluster.klima.uni-bremen.de/~oggm/gdirs/oggm_v1.6/L3-L5_files/2025.6/centerlines/W5E5/per_glacier_spinup'\n", "gdirs = workflow.init_glacier_directories(rgi_ids, from_prepro_level=3, prepro_border=80, prepro_base_url=base_url)\n", "gdir_cl = gdirs[0]\n", "gdir_cl" @@ -109,7 +109,7 @@ "cfg.PATHS['working_dir'] = utils.gettempdir(dirname='OGGM-elevbands', reset=True)\n", "\n", "# Note the new url\n", - "base_url = 'https://cluster.klima.uni-bremen.de/~oggm/gdirs/oggm_v1.6/L3-L5_files/2023.3/elev_bands/W5E5/'\n", + "base_url = 'https://cluster.klima.uni-bremen.de/~oggm/gdirs/oggm_v1.6/L3-L5_files/2025.6/elev_bands/W5E5/per_glacier_spinup'\n", "gdirs = workflow.init_glacier_directories(rgi_ids, from_prepro_level=3, prepro_border=80, prepro_base_url=base_url)\n", "gdir_eb = gdirs[0]\n", "gdir_eb" @@ -135,7 +135,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Glacier length and cross section" + "## Glacier length and cross-section" ] }, { @@ -207,7 +207,7 @@ "source": [ "from oggm.shop import 
gcm_climate\n", "\n", - "# you can choose one of these 5 different GCMs:\n", + "# you can choose for example one of these 5 primary ISIMIP3b GCMs:\n", "# 'gfdl-esm4_r1i1p1f1', 'mpi-esm1-2-hr_r1i1p1f1', 'mri-esm2-0_r1i1p1f1' (\"low sensitivity\" models, within typical ranges from AR6)\n", "# 'ipsl-cm6a-lr_r1i1p1f1', 'ukesm1-0-ll_r1i1p1f2' (\"hotter\" models, especially ukesm1-0-ll)\n", "member = 'mri-esm2-0_r1i1p1f1' \n", @@ -227,7 +227,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "For the ice dynamics simulations, the commands are exactly the same as well. The only difference is that centerlines require the more flexible \"FluxBased\" numerical model, while the elevation bands can also use the more robust \"SemiImplicit\" one. **The runs are considerabily faster with the elevation bands flowlines.**" + "For the ice dynamics simulations, the commands are exactly the same as well. The only difference is that centerlines require the more flexible \"FluxBased\" numerical model, while the elevation bands can also use the more robust \"SemiImplicit\" one. 
**The runs are considerably faster with the elevation bands flowlines.**" ] }, { @@ -250,7 +250,7 @@ "\n", " workflow.execute_entity_task(tasks.run_from_climate_data, [gdir],\n", " output_filesuffix='_historical', \n", - " );\n", + " )\n", "\n", " for ssp in ['ssp126', 'ssp370', 'ssp585']:\n", " rid = f'_ISIMIP3b_{member}_{ssp}'\n", @@ -277,11 +277,11 @@ "for ssp in ['ssp126','ssp370', 'ssp585']:\n", " rid = f'_ISIMIP3b_{member}_{ssp}'\n", " with xr.open_dataset(gdir_cl.get_filepath('model_diagnostics', filesuffix=rid)) as ds:\n", - " ds.volume_m3.plot(ax=ax1, label=ssp, c=color_dict[ssp]);\n", + " ds.volume_m3.plot(ax=ax1, label=ssp, c=color_dict[ssp])\n", "for ssp in ['ssp126','ssp370', 'ssp585']:\n", " rid = f'_ISIMIP3b_{member}_{ssp}'\n", " with xr.open_dataset(gdir_eb.get_filepath('model_diagnostics', filesuffix=rid)) as ds:\n", - " ds.volume_m3.plot(ax=ax1, label=ssp, c=color_dict[ssp], ls='--');\n", + " ds.volume_m3.plot(ax=ax1, label=ssp, c=color_dict[ssp], ls='--')\n", " ax1.set_title('Glacier volume')\n", " ax1.set_xlim([2020,2100])\n", " ax1.set_ylim([0, ds.volume_m3.max().max()*1.1])\n", @@ -289,7 +289,7 @@ "for ssp in ['ssp126','ssp370', 'ssp585']:\n", " rid = f'_ISIMIP3b_{member}_{ssp}'\n", " with xr.open_dataset(gdir_cl.get_filepath('model_diagnostics', filesuffix=rid)) as ds:\n", - " ds.length_m.plot(ax=ax2, label=ssp, c=color_dict[ssp]);\n", + " ds.length_m.plot(ax=ax2, label=ssp, c=color_dict[ssp])\n", " ax2.set_ylim([0, ds.length_m.max().max()*1.1])\n", "for ssp in ['ssp126','ssp370', 'ssp585']:\n", " rid = f'_ISIMIP3b_{member}_{ssp}'\n", @@ -332,7 +332,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Both models can be reprensented with a cross-section, like this: " + "Both models can be represented with a cross-section, like this:" ] }, { @@ -408,7 +408,7 @@ "source": [ "- in the absence of additional data to better calibrate the mass balance model, using multiple centerlines is considered not useful: indeed, the distributed 
representation offers little advantage if the mass balance is only a function of elevation.\n", "- elevation band flowlines are now the default of most OGGM applications. It is faster, much cheaper, and more robust to use these simplified glaciers.\n", - "- elevation band flowlines cannot be represented on a map \"out of the box\". We have however developped a tool to display the changes by redistributing them on a map: have a look at [this tutorial](../tutorials/distribute_flowline.ipynb)!\n", + "- elevation band flowlines cannot be represented on a map \"out of the box\". We have however developed a tool to display the changes by redistributing them on a map: have a look at [this tutorial](../tutorials/distribute_flowline.ipynb)!\n", "- multiple centerlines can be useful for growing glacier cases and use cases where geometry plays an important role (e.g. lakes, paleo applications)." ] }, diff --git a/notebooks/tutorials/full_prepro_workflow.ipynb b/notebooks/tutorials/full_prepro_workflow.ipynb index c616c126..5f3229a1 100644 --- a/notebooks/tutorials/full_prepro_workflow.ipynb +++ b/notebooks/tutorials/full_prepro_workflow.ipynb @@ -11,7 +11,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "The OGGM workflow is best explained with an example. In the following, we will show how to apply the standard [OGGM workflow](http://docs.oggm.org/en/stable/introduction.html) to a list of glaciers. This example is meant to guide you through a first-time setup step-by-step. If you prefer not to install OGGM on your computer, you can always run this notebook in [OGGM-Edu](https://edu.oggm.org) instead!" + "The OGGM workflow is best explained with an example. In the following, we will show how to apply the standard [OGGM workflow](https://docs.oggm.org/en/stable/introduction.html) to a list of glaciers. This example is meant to guide you through a first-time setup step-by-step. 
If you prefer not to install OGGM on your computer, you can always run this notebook in [OGGM-Edu](https://edu.oggm.org) instead!" ] }, { @@ -159,7 +159,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We use a temporary directory for this example, but in practice you will set this working directory yourself (for example: `/home/john/OGGM_output`. The size of this directory will depend on how many glaciers you'll simulate!\n", + "We use a temporary directory for this example, but in practice you will set this working directory yourself (for example: `/home/john/OGGM_output`). The size of this directory will depend on how many glaciers you'll simulate!\n", "\n", "**This working directory is meant to be persistent**, i.e. you can stop your processing workflow after any task, and restart from an existing working directory at a later stage.\n", "\n", @@ -192,7 +192,7 @@ "Here is a list of other glaciers you might want to try out:\n", "- `RGI60-18.02342`: Tasman Glacier in New Zealand\n", "- `RGI60-11.00787`: [Kesselwandferner](https://de.wikipedia.org/wiki/Kesselwandferner) in the Austrian Alps\n", - "- `RGI60-11.00897`: [Hintereisferner](http://acinn.uibk.ac.at/research/ice-and-climate/projects/hintereisferner) in the Austrian Alps.\n", + "- `RGI60-11.00897`: [Hintereisferner](https://acinn.uibk.ac.at/research/ice-and-climate/projects/hintereisferner) in the Austrian Alps.\n", "- ... or any other glacier identifier! You can find other glacier identifiers by exploring the [GLIMS viewer](https://www.glims.org/maps/glims). See the [working with the RGI](working_with_rgi.ipynb) tutorial for an introduction on RGI IDs and the GLIMS browser.\n", "\n", "For an operational run on an RGI region, you might want to download the [Randolph Glacier Inventory](https://www.glims.org/RGI/) dataset instead, and start a run from it. This case is covered in the [working with the RGI](working_with_rgi.ipynb) tutorial." 
@@ -223,7 +223,7 @@ "outputs": [], "source": [ "# Where to fetch the pre-processed directories\n", - "base_url = 'https://cluster.klima.uni-bremen.de/~oggm/gdirs/oggm_v1.6/L3-L5_files/2023.3/centerlines/W5E5'\n", + "base_url = 'https://cluster.klima.uni-bremen.de/~oggm/gdirs/oggm_v1.6/L3-L5_files/2025.6/centerlines/W5E5/per_glacier_spinup/'\n", "gdirs = workflow.init_glacier_directories(rgi_ids, from_prepro_level=3, prepro_base_url=base_url, prepro_border=80)" ] }, @@ -345,12 +345,12 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "There are two different types of \"[tasks](http://docs.oggm.org/en/stable/api.html#entity-tasks)\":\n", + "There are two different types of \"[tasks](https://docs.oggm.org/en/stable/api.html#entity-tasks)\":\n", "\n", "**Entity Tasks**:\n", " Standalone operations to be realized on one single glacier entity,\n", - " independently from the others. The majority of OGGM\n", - " tasks are entity tasks. They are parallelisable: the same task can run on \n", + " independent of the others. The majority of OGGM\n", + " tasks are entity tasks. They are parallelisable: the same task can run on\n", " several glaciers in parallel.\n", "\n", "**Global Task**:\n", @@ -385,7 +385,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "The task we just applied to our list of glaciers is [glacier_masks](http://docs.oggm.org/en/stable/generated/oggm.tasks.glacier_masks.html#oggm.tasks.glacier_masks). It wrote a new file in our glacier directory, providing raster masks of the glacier (among other things): " + "The task we just applied to our list of glaciers is [glacier_masks](https://docs.oggm.org/en/stable/generated/oggm.tasks.glacier_masks.html#oggm.tasks.glacier_masks). It wrote a new file in our glacier directory, providing raster masks of the glacier (among other things):" ] }, { @@ -401,7 +401,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "It is also possible to apply several tasks sequentially (i.e. 
one after an other) on our glacier list:" + "It is also possible to apply several tasks sequentially (i.e. one after another) on our glacier list:" ] }, { @@ -541,7 +541,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "With the computed mass-balance and the flowlines, OGGM can now compute the ice thickness, based on the principles of [mass conservation and ice dynamics](http://docs.oggm.org/en/stable/inversion.html). " + "With the computed mass-balance and the flowlines, OGGM can now compute the ice thickness, based on the principles of [mass conservation and ice dynamics](https://docs.oggm.org/en/stable/inversion.html)." ] }, { @@ -646,7 +646,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Let's start a run driven by a the climate of the last 31 years, shuffled randomly for 200 years. This can be seen as a \"commitment\" simulation, i.e. how much glaciers will change even without further climate change:" + "Let's start a run driven by the climate of the last 31 years, shuffled randomly for 200 years. This can be seen as a \"commitment\" simulation, i.e. how much glaciers will change even without further climate change:" ] }, { @@ -699,7 +699,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We opened the file with [xarray](http://xarray.pydata.org), a very useful data analysis library based on [pandas](http://pandas.pydata.org/). For example, we can plot the volume and length evolution of both glaciers with time:" + "We opened the file with [xarray](https://xarray.pydata.org), a very useful data analysis library based on [pandas](https://pandas.pydata.org/). 
For example, we can plot the volume and length evolution of both glaciers with time:" ] }, { @@ -709,7 +709,7 @@ "outputs": [], "source": [ "f, (ax1, ax2) = plt.subplots(1, 2, figsize=(13, 4))\n", - "ds2000.volume.plot.line(ax=ax1, hue='rgi_id');\n", + "ds2000.volume.plot.line(ax=ax1, hue='rgi_id')\n", "ds2000.length.plot.line(ax=ax2, hue='rgi_id');" ] }, @@ -755,7 +755,7 @@ "source": [ "workflow.execute_entity_task(tasks.run_random_climate, gdirs, nyears=200,\n", " temperature_bias=0.5,\n", - " y0=2000, output_filesuffix='_p05');\n", + " y0=2000, output_filesuffix='_p05')\n", "workflow.execute_entity_task(tasks.run_random_climate, gdirs, nyears=200,\n", " temperature_bias=-0.5,\n", " y0=2000, output_filesuffix='_m05');" @@ -779,15 +779,15 @@ "source": [ "f, (ax1, ax2, ax3) = plt.subplots(1, 3, figsize=(16, 4))\n", "rgi_id = 'RGI60-11.01328'\n", - "ds2000.sel(rgi_id=rgi_id).volume.plot.line(ax=ax1, hue='rgi_id', label='Commitment');\n", - "ds2000.sel(rgi_id=rgi_id).area.plot.line(ax=ax2, hue='rgi_id');\n", - "ds2000.sel(rgi_id=rgi_id).length.plot.line(ax=ax3, hue='rgi_id');\n", - "dsp.sel(rgi_id=rgi_id).volume.plot.line(ax=ax1, hue='rgi_id', label='$+$ 0.5°C');\n", - "dsp.sel(rgi_id=rgi_id).area.plot.line(ax=ax2, hue='rgi_id');\n", - "dsp.sel(rgi_id=rgi_id).length.plot.line(ax=ax3, hue='rgi_id');\n", - "dsm.sel(rgi_id=rgi_id).volume.plot.line(ax=ax1, hue='rgi_id', label='$-$ 0.5°C');\n", - "dsm.sel(rgi_id=rgi_id).area.plot.line(ax=ax2, hue='rgi_id');\n", - "dsm.sel(rgi_id=rgi_id).length.plot.line(ax=ax3, hue='rgi_id');\n", + "ds2000.sel(rgi_id=rgi_id).volume.plot.line(ax=ax1, hue='rgi_id', label='Commitment')\n", + "ds2000.sel(rgi_id=rgi_id).area.plot.line(ax=ax2, hue='rgi_id')\n", + "ds2000.sel(rgi_id=rgi_id).length.plot.line(ax=ax3, hue='rgi_id')\n", + "dsp.sel(rgi_id=rgi_id).volume.plot.line(ax=ax1, hue='rgi_id', label='$+$ 0.5°C')\n", + "dsp.sel(rgi_id=rgi_id).area.plot.line(ax=ax2, hue='rgi_id')\n", + "dsp.sel(rgi_id=rgi_id).length.plot.line(ax=ax3, 
hue='rgi_id')\n", + "dsm.sel(rgi_id=rgi_id).volume.plot.line(ax=ax1, hue='rgi_id', label='$-$ 0.5°C')\n", + "dsm.sel(rgi_id=rgi_id).area.plot.line(ax=ax2, hue='rgi_id')\n", + "dsm.sel(rgi_id=rgi_id).length.plot.line(ax=ax3, hue='rgi_id')\n", "ax1.legend();" ] }, @@ -807,6 +807,13 @@ "- return to the [OGGM documentation](https://docs.oggm.org)\n", "- back to the [table of contents](../welcome.ipynb)" ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] } ], "metadata": { diff --git a/notebooks/tutorials/holoviz_intro.ipynb b/notebooks/tutorials/holoviz_intro.ipynb index d6bf3fd0..82601696 100644 --- a/notebooks/tutorials/holoviz_intro.ipynb +++ b/notebooks/tutorials/holoviz_intro.ipynb @@ -22,7 +22,7 @@ "source": [ "This notebook is intended to present a small overview of HoloViz and the capability for data exploration, with interactive plots (show difference between matplotlib and bokeh). Many parts are based on or copied from the official [HoloViz Tutorial](https://holoviz.org/tutorial/index.html) (highly recommended for a more extensive overview of the possibilities of HoloViz).\n", "\n", - "Note: In June 2019 the project name changed from [PyViz](https://pyviz.org/) to [HoloViz](https://holoviz.org/). The reason for this is explained in this [blog post](http://blog.pyviz.org/pyviz-holoviz.html)." + "Note: In June 2019 the project name changed from [PyViz](https://pyviz.org/) to [HoloViz](https://holoviz.org/). The reason for this is explained in this [blog post](https://blog.pyviz.org/pyviz-holoviz.html)." ] }, { @@ -82,7 +82,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Of course we can have a look at one variable only:" + "Of course, we can have a look at one variable only:" ] }, { @@ -115,7 +115,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We can see the course of the parameter but we can not tell what was the exact temperature at January and we also cannot zoom in." 
+ "We can see the course of the parameter, but we cannot tell the exact temperature in January, and we also cannot zoom in." ] }, { @@ -173,7 +173,7 @@ "source": [ "But at least you can use your mouse to hover over each variable and explore their values. Furthermore, by clicking on the legend the colors can be switched on/off. Still, different magnitudes make it hard to see all parameters at once.\n", "\n", - "Here the interactive features are provided by the [Bokeh](http://bokeh.pydata.org) JavaScript-based plotting library. But what's actually returned by this call is a overlay of something called a [HoloViews](http://holoviews.org) object, here specifically a HoloViews [Curve](http://holoviews.org/reference/elements/bokeh/Curve.html). HoloViews objects *display* as a Bokeh plot, but they are actually much richer objects that make it easy to capture your understanding as you explore the data." + "Here the interactive features are provided by the [Bokeh](https://bokeh.pydata.org) JavaScript-based plotting library. But what's actually returned by this call is an overlay of something called a [HoloViews](https://holoviews.org) object, here specifically a HoloViews [Curve](https://holoviews.org/reference/elements/bokeh/Curve.html). HoloViews objects *display* as a Bokeh plot, but they are actually much richer objects that make it easy to capture your understanding as you explore the data." ] }, { @@ -306,7 +306,7 @@ "As you can see, with HoloViews you don't have to select between plotting your data and working with it numerically. Any HoloViews object will let you do *both* conveniently; you can simply choose whatever representation is the most appropriate way to approach the task you are doing. This approach is very different from a traditional plotting program, where the objects you create (e.g. a Matplotlib figure or a native Bokeh plot) are a dead end from an analysis perspective, useful only for plotting.
\n", "### HoloViews Elements\n", "\n", - "Holoview objects merge the visualization with the data. For an Holoview object you have to classify what the data is showing. A Holoview object could be initialised in several ways: \n", + "Holoview objects merge the visualization with the data. For a Holoview object you have to classify what the data is showing. A Holoview object could be initialised in several ways:\n", "\n", "```\n", "hv.Element(data, kdims=None, vdims=None, **kwargs)\n", @@ -314,7 +314,7 @@ "\n", "This standard signature consists of the same five types of information:\n", "\n", - "- **``Element``**: any of the dozens of element types shown in the [reference gallery](http://holoviews.org/reference/index.html).\n", + "- **``Element``**: any of the dozens of element types shown in the [reference gallery](https://holoviews.org/reference/index.html).\n", "- **``data``**: your data in one of a number of formats described below, such as tabular dataframes or multidimensional gridded Xarray or Numpy arrays.\n", "- **``kdims``**: \"key dimension(s)\", also called independent variables or index dimensions in other contexts---the values for which your data was measured.\n", "- **``vdims``**: \"value dimension(s)\", also called dependent variables or measurements---what was measured or recorded for each value of the key dimensions. \n", @@ -338,7 +338,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "The example also shows two ways of labeling the variables, one is directly by the initialisation with tuples ```('x','x_label')``` and ```('y','y_label')``` and a other option is to use ```.redim.label()```.\n", + "The example also shows two ways of labeling the variables, one is directly by the initialisation with tuples ```('x','x_label')``` and ```('y','y_label')``` and another option is to use ```.redim.label()```.\n", "\n", "The example above also shows the simple syntax to create a layout of different Holoview Objects by using `+`. 
With `*` you can simply overlay the objects in one plot:" ] @@ -360,7 +360,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "With ```.opts()``` you can change some characteristics of the Holoview Objects and you can use the `[tab]` key completion to see, what options are available or you can use the ```hv.help()``` function to get more information about some `Elements`." + "With ```.opts()``` you can change some characteristics of the Holoview Objects and you can use the `[tab]` key completion to see, what options are available, or you can use the ```hv.help()``` function to get more information about some `Elements`." ] }, { @@ -399,9 +399,9 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "So here we created a ```Curve``` Element for some Parameters and put them together in subplots by using `+` and overlay some in one subplot with `*`. With ```.opts()``` I define the color of some parameters and set the ```width``` and ```height``` propertie for the used ```Curve``` Elements and with ```.cols()``` I define the number of columns. \n", + "So here we created a ```Curve``` Element for some Parameters and put them together in subplots by using `+` and overlay some in one subplot with `*`. With ```.opts()``` I define the color of some parameters and set the ```width``` and ```height``` properties for the used ```Curve``` Elements and with ```.cols()``` I define the number of columns.\n", "\n", - "Now we can zoom in and use a hover for data exploration and because all Holoview Objects using the same dataframe and the same key variable the x-axes of all plots are linked. So when you zoom in in one plot all the other plots are zoomed in as well.\n", + "Now we can zoom in and use a hover for data exploration and because all Holoview Objects using the same dataframe and the same key variable the x-axes of all plots are linked. 
So when you zoom in on one plot, all the other plots are zoomed in as well.\n", "\n", "### HoloView Dataset and HoloMap Objects\n", "\n", @@ -506,7 +506,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Here now no widget is created, instead there is a interactive legend where we can turn the color *on* by clicking in the legend on it. So we can compare the months with each other (for example the same month in different years).\n", + "Here no widget is created; instead there is an interactive legend where we can turn the color *on* by clicking on it in the legend. So we can compare the months with each other (for example the same month in different years).\n", "\n", "It is also easy to look at some mean values, for example looking at mean diurnal values for each month and year you can use ```.aggregate```, which combine the values after the given function:" ] @@ -590,7 +590,7 @@ "source": [ "### Tile sources\n", "\n", - "Tile sources are very convenient ways to provide geographic context for a plot and they will be familiar from the popular mapping services like Google Maps and Openstreetmap. The ``WMTS`` element provides an easy way to include such a tile source in your visualization simply by passing it a valid URL template. GeoViews provides a number of useful tile sources in the ``gv.tile_sources`` module:" + "Tile sources are very convenient ways to provide geographic context for a plot, and they will be familiar from the popular mapping services like Google Maps and OpenStreetMap. The ``WMTS`` element provides an easy way to include such a tile source in your visualization simply by passing it a valid URL template. GeoViews provides a number of useful tile sources in the ``gv.tile_sources`` module:" ] }, { @@ -669,7 +669,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "And so similar a visualisation is stored for each GeoView Element, which can be used like an HoloView Object.
So as a last example you also can plot all European glaciers in one interactive plot by using an Polygons Element of GeoViews:" + "Similarly, a visualisation is stored for each GeoViews Element, which can be used like a HoloViews Object. So as a last example you can also plot all European glaciers in one interactive plot by using a Polygons Element of GeoViews:" ] }, { @@ -686,7 +686,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "So this only was a very small look at the capability of HoloViz for data exploration and visualisation. There are much more you can do with HoloViz, but I think it is a package you should have a look at, because with only a few lines of code you can create an interactive plot which allow you to have an quick but also deep look at your data. I really recommend to visit the official [HoloViz Tutorial](https://holoviz.org/tutorial/index.html) and start using HoloViz :)" + "So this was only a very small look at the capabilities of HoloViz for data exploration and visualisation. There is much more you can do with HoloViz, but I think it is a package you should have a look at, because with only a few lines of code you can create an interactive plot which allows you to have a quick but also deep look at your data. I really recommend visiting the official [HoloViz Tutorial](https://holoviz.org/tutorial/index.html) and starting to use HoloViz :)" ] }, { diff --git a/notebooks/tutorials/ingest_gridded_data_on_flowlines.ipynb index ae23abbe..d968986d 100644 --- a/notebooks/tutorials/ingest_gridded_data_on_flowlines.ipynb +++ b/notebooks/tutorials/ingest_gridded_data_on_flowlines.ipynb @@ -11,7 +11,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "After running our OGGM experiments we often want to compare the model output to other gridded observations or maybe we want to use additional data sets that are not currently in the [OGGM shop](https://docs.oggm.org/en/stable/input-data.html) to calibrate parameters in the model (e.g. Glen A creep parameter, sliding parameter or the calving constant of proportionality). If you are looking on ways or ideas on how to do this, you are in the right tutorial!\n", + "After running our OGGM experiments we often want to compare the model output to other gridded observations, or maybe we want to use additional data sets that are not currently in the [OGGM shop](https://docs.oggm.org/en/stable/input-data.html) to calibrate parameters in the model (e.g. Glen A creep parameter, sliding parameter or the calving constant of proportionality). If you are looking for ways or ideas on how to do this, you are in the right tutorial!\n", "\n", "In OGGM, a local map projection is defined for each glacier entity in the RGI inventory following the methods described in [Maussion and others (2019)](https://gmd.copernicus.org/articles/12/909/2019/). The model uses a Transverse Mercator projection centred on the glacier.
A lot of data sets, especially those from Polar regions can have a different projections and if we are not careful, we would be making mistakes when we compare them with our model output or when we use such data sets to constrain our model experiments.\n", "\n", @@ -63,7 +63,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Lets define the glaciers for the run " + "## Let's define the glaciers for the run" ] }, { @@ -92,7 +92,7 @@ "from_prepro_level = 3\n", "# URL of the preprocessed gdirs\n", "# we use elevation bands flowlines here\n", - "base_url = 'https://cluster.klima.uni-bremen.de/~oggm/gdirs/oggm_v1.6/L3-L5_files/2023.3/elev_bands/W5E5'\n", + "base_url = 'https://cluster.klima.uni-bremen.de/~oggm/gdirs/oggm_v1.6/L3-L5_files/2025.6/elev_bands/W5E5/per_glacier'\n", "gdirs = workflow.init_glacier_directories(rgi_ids,\n", " from_prepro_level=from_prepro_level,\n", " prepro_base_url=base_url,\n", @@ -118,7 +118,8 @@ }, "outputs": [], "source": [ - "graphics.plot_googlemap(gdir, figsize=(8, 7))" + "graphics.plot_googlemap(gdir, figsize=(8, 7))\n", + "# You have to manually add an API KEY. If you run it on jupyter hub or binder, we do that for you." 
] }, { @@ -227,7 +228,7 @@ " output_folder=None, # by default the final file is saved at cfg.PATHS['working_dir']\n", " output_filename='gridded_data_merged', # the default file is saved as gridded_data_merged.nc\n", " included_variables='all', # you also can provide a list of variables here\n", - " add_topography=False, # here we can add topography for the new extend\n", + " add_topography=False, # here we can add topography for the new extent\n", " reset=False, # set to True if you want to overwrite an already existing file (for playing around)\n", ")" ] }, { @@ -281,7 +282,7 @@ "source": [ "## Add data from OGGM-Shop: bed topography data\n", "\n", - "Additionally to the data produced by the model, the [OGGM-Shop](https://docs.oggm.org/en/stable/input-data.html) counts with routines that will automatically download and reproject other useful data sets into the glacier projection (For more information also check out this [notebook](https://oggm.org/tutorials/stable/notebooks/oggm_shop.html)). This data will be stored under the file described above. " + "In addition to the data produced by the model, the [OGGM-Shop](https://docs.oggm.org/en/stable/input-data.html) provides routines that will automatically download and reproject other useful data sets into the glacier projection (for more information also check out this [notebook](https://oggm.org/tutorials/stable/notebooks/oggm_shop.html)). This data will be stored in the file described above." ] }, { @@ -376,7 +377,7 @@ "\n", "If you want more velocity products, feel free to open a new topic on the OGGM issue tracker!\n", "\n", - "> this will download severals large datasets **depending on your connection, it might take some time** ..." + "> this will download several large datasets **depending on your connection, it might take some time** ..."
] }, { @@ -389,9 +390,16 @@ "source": [ "# attention downloads data!!!\n", "from oggm.shop import millan22\n", - "workflow.execute_entity_task(millan22.velocity_to_gdir, gdirs);" + "workflow.execute_entity_task(millan22.millan_velocity_to_gdir, gdirs);" ] }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + }, { "cell_type": "markdown", "metadata": {}, @@ -511,7 +519,7 @@ " bin_variables=['consensus_ice_thickness', \n", " 'millan_vx',\n", " 'millan_vy'],\n", - " preserve_totals=[True, False, False] # I\"m actually not sure if preserving totals is meaningful with velocities - likely not\n", + " preserve_totals=[True, False, False] # I am actually not sure if preserving totals is meaningful with velocities - likely not\n", " # NOTE: we could bin variables according to max() as well!\n", " )" ] @@ -795,7 +803,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Inversion velocities are for a glacier at equilibrium - this is not always meaningful. Lets do a run and store the velocities with time:" + "Inversion velocities are for a glacier at equilibrium - this is not always meaningful. Let's do a run and store the velocities with time:" ] }, { @@ -913,6 +921,13 @@ "- return to the [OGGM documentation](https://docs.oggm.org)\n", "- back to the [table of contents](../welcome.ipynb)" ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] } ], "metadata": { diff --git a/notebooks/tutorials/inversion.ipynb b/notebooks/tutorials/inversion.ipynb index d192bbfb..84ab1e44 100644 --- a/notebooks/tutorials/inversion.ipynb +++ b/notebooks/tutorials/inversion.ipynb @@ -26,7 +26,7 @@ "\n", "There is no reason to think that the ice parameters are the same between\n", "neighboring glaciers. 
There is currently no \"good\" way to calibrate them,\n", - "or at least no generaly accepted one.\n", + "or at least no generally accepted one.\n", "We won't discuss the details here, but we provide a script to illustrate\n", "the sensitivity of the model to this choice.\n", "\n", @@ -82,7 +82,7 @@ "# (we specifically need `geometries.pkl` in the gdirs)\n", "cfg.PARAMS['border'] = 80\n", "base_url = ('https://cluster.klima.uni-bremen.de/~oggm/gdirs/oggm_v1.6/'\n", - " 'L3-L5_files/2023.3/centerlines/W5E5/')\n", + " 'L3-L5_files/2025.6/centerlines/W5E5/per_glacier_spinup') # todo<-- here is an issue with preprocessed gdir...todo\n", "gdirs = workflow.init_glacier_directories(rgidf, from_prepro_level=3,\n", " prepro_base_url=base_url)\n", "\n", @@ -141,7 +141,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "The data are stored as csv files in the working directory. The easiest way to read them is to use [pandas](http://pandas.pydata.org/)!" + "The data are stored as csv files in the working directory. The easiest way to read them is to use [pandas](https://pandas.pydata.org/)!" 
] }, { @@ -307,7 +307,7 @@ "metadata": {}, "outputs": [], "source": [ - "dftot.plot();\n", + "dftot.plot()\n", "plt.xlabel('Factor of Glen A (default 1)'); plt.ylabel('Regional volume (km$^3$)');" ] }, @@ -435,11 +435,11 @@ "metadata": {}, "outputs": [], "source": [ - "# save the distributed ice thickness into a geotiff file\n", + "# save the distributed ice thickness into a geotiff file\n", "workflow.execute_entity_task(tasks.gridded_data_var_to_geotiff, gdirs, varname='distributed_thickness')\n", "\n", - "# The default path of the geotiff file is in the glacier directory with the name \"distributed_thickness.tif\"\n", - "# Let's check if the file exists\n", + "# The default path of the geotiff file is in the glacier directory with the name \"distributed_thickness.tif\"\n", + "# Let's check if the file exists\n", "for gdir in gdirs:\n", " path = os.path.join(gdir.dir, 'distributed_thickness.tif')\n", " assert os.path.exists(path)" @@ -485,7 +485,8 @@ "rgi_ids = ['RGI60-11.0{}'.format(i) for i in range(3205, 3211)]\n", "sel_gdirs = [gdir for gdir in gdirs if gdir.rgi_id in rgi_ids]\n", "graphics.plot_googlemap(sel_gdirs)\n", - "# you might need to install motionless if it is not yet in your environment" + "# you might need to install motionless if it is not yet in your environment\n", + "# You have to manually add an API KEY. If you run it on jupyter hub or binder, we do that for you." ] }, { @@ -515,7 +516,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "This is however not always very useful because OGGM can only plot on a map as large as the local glacier map of the first glacier in the list. See [this issue](https://github.com/OGGM/oggm/issues/1007) for a discussion about why. In this case, we had a large enough border, and like that all neighboring glacers are visible." + "This is however not always very useful because OGGM can only plot on a map as large as the local glacier map of the first glacier in the list. 
See [this issue](https://github.com/OGGM/oggm/issues/1007) for a discussion about why. In this case, we had a large enough border, and like that all neighboring glaciers are visible." ] }, { @@ -615,6 +616,13 @@ "- return to the [OGGM documentation](https://docs.oggm.org)\n", "- back to the [table of contents](../welcome.ipynb)" ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] } ], "metadata": { diff --git a/notebooks/tutorials/ioggm.ipynb b/notebooks/tutorials/ioggm.ipynb index 431533b0..4a8f745e 100644 --- a/notebooks/tutorials/ioggm.ipynb +++ b/notebooks/tutorials/ioggm.ipynb @@ -9,7 +9,7 @@ "\n", "This tutorial gives you the tools to run IGM within OGGM and also compare it with OGGM runs. \n", "\n", - "**This is very much work in progress.** You'll need an IGM installation for this to run. The notebook currently does not run on OGGM Hub, because of the Tensorflow depedency. We are working on it!" + "**This is very much work in progress.** You'll need an IGM installation for this to run. The notebook currently does not run on OGGM Hub, because of the Tensorflow dependency. We are working on it!" ] }, { @@ -240,7 +240,7 @@ "\n", "# set values outside the glacier to np.nan\n", "# using the glacier mask, as otherwise there is more ice from surrounding glaciers in the domain, \n", - "# which shouldn't accumulate more ice, still adds to the total volume/area of the domain.. either mask it out beforehand or before doing plots.\n", + "# which shouldn't accumulate more ice, still adds to the total volume/area of the domain ... either mask it out beforehand or before doing plots.\n", "# experiment with it: does the mass outside of the mask only decrease? 
=> ?\n", "gd['cook23_thk_masked'] = xr.where(gd.glacier_mask, gd.cook23_thk, np.nan)\n", "\n" diff --git a/notebooks/tutorials/kcalving_parameterization.ipynb b/notebooks/tutorials/kcalving_parameterization.ipynb index dd46b671..53c2d88b 100644 --- a/notebooks/tutorials/kcalving_parameterization.ipynb +++ b/notebooks/tutorials/kcalving_parameterization.ipynb @@ -13,7 +13,7 @@ "source": [ "