Create an Experiment in Azure Machine Learning
Hyperparameter tuning is the process of experimenting with the settings that control how a model trains, such as batch size and learning rate. Azure Machine Learning Studio provides Experiments as a way to track the results of hyperparameter tuning. In this lab, we set up and run a hyperparameter tuning job using Experiments, then view and evaluate the results in the Azure portal.

Challenge
Azure Setup
- Open Machine Learning Studio. Use the Preview Environment to become familiar with the new look and feel. Note: Create the workspace in the same region as your lab-provided resource group.
- Create a Compute instance.
- The Compute must be uniquely named. We can use the name of the Machine Learning Workspace as part of the Compute name to guarantee uniqueness.
- This will take a few minutes to spin up. Grab a coffee or tea while waiting.
- Clone the Jupyter Notebook from GitHub.
- Go to the Notebooks section of Machine Learning Studio.
- Create a new Python notebook and edit it in Jupyter.
- In the first cell, run the following:
!git clone https://github.com/linuxacademy/content-dp100
- After the repo is cloned, close this notebook and open the content-dp100/notebooks/MNIST_AzureExperiment.ipynb notebook, choosing to edit it in Jupyter.
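The setup steps suggest building a unique Compute name from the Machine Learning Workspace name. As a minimal sketch of that idea (the function name, suffix, and length limit here are assumptions, not part of the lab; check the portal's validation message for the exact naming rules):

```python
def make_compute_name(workspace_name: str, suffix: str = "ci") -> str:
    """Derive a compute name from the workspace name plus a short suffix.

    Keeps only letters, digits, and hyphens, and truncates the base to a
    conservative length, since Azure compute names are length-limited
    (the exact limit is an assumption here -- verify it in the portal).
    """
    base = "".join(c for c in workspace_name if c.isalnum() or c == "-")
    return f"{base[:12]}-{suffix}".lower()

print(make_compute_name("MyMLWorkspace"))  # mymlworkspac-ci
```

Because the workspace name is already unique within the lab environment, reusing it this way keeps the Compute name unique as well.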
Challenge
Create and Run the Experiment
- Follow the steps in the notebook.
  - Select Python 3.8 - AzureML in the top right corner.
  - Run each code cell in the MNIST_AzureExperiment notebook and view the results.
  - Read the explanations to get a better understanding of what each cell is doing.
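As the notebook runs, it logs metrics (such as accuracy for each batch size) to the Experiment so they can be charted later. The stdlib sketch below illustrates that logging pattern only; the `FakeRun` class is a stand-in for illustration, not the real `azureml.core.Run` API used in the notebook:

```python
from collections import defaultdict

class FakeRun:
    """Stand-in for an experiment run: collects named metric values so
    they can later be charted. Mimics the log-by-name pattern only."""
    def __init__(self):
        self.metrics = defaultdict(list)

    def log(self, name, value):
        # Each call appends one observation under the metric's name.
        self.metrics[name].append(value)

run = FakeRun()
for acc in (0.91, 0.95, 0.97):
    run.log("accuracy", acc)
print(run.metrics["accuracy"])  # [0.91, 0.95, 0.97]
```

Logging a metric repeatedly under the same name is what lets Machine Learning Studio plot it as a series across runs or epochs.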
Challenge
Evaluate the Results of the Experiment
- View the results in Machine Learning Studio.
  - The final cell of the notebook provides a link directly to the mnist experiment. Run the cell and click the link.
  - Change the graph showing batch_size to instead show accuracy.
  - Remove the Compute target, Job type, and Created by columns from the table.
  - Add the accuracy column to the table.
- Evaluate how changing batch_size affects the loss, accuracy, and training time. What would explain this phenomenon? Do further research.
Larger batch sizes are prone to becoming stuck in local minima or missing minima entirely. The models train much faster because fewer backpropagation passes happen per epoch, but the cost is accuracy. Batch sizes between 32 and 256 are good starting points for most models, but batch size requires tuning like any other hyperparameter.
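The speedup from larger batches comes down to arithmetic: one epoch performs one gradient update per batch, so fewer, larger batches mean fewer backpropagation passes. A quick sketch using MNIST's 60,000 training images (the function name is illustrative, not from the notebook):

```python
import math

def updates_per_epoch(n_samples: int, batch_size: int) -> int:
    """Number of gradient updates (backpropagation passes) in one epoch."""
    return math.ceil(n_samples / batch_size)

# MNIST has 60,000 training images; larger batches mean far fewer
# weight updates per epoch, so training finishes faster, but each
# update averages the gradient over many more examples.
for bs in (32, 256, 2048):
    print(bs, updates_per_epoch(60_000, bs))
# 32 -> 1875 updates, 256 -> 235, 2048 -> 30
```

Going from a batch size of 32 to 2048 cuts the updates per epoch from 1875 to 30, which is why training time drops sharply even as the coarser gradient averaging hurts accuracy.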

