This guide covers how to deploy and configure your environment using the Azure Marketplace app: Discovery Hub with ADLS Gen2, Azure SQL DB, AAS, and ML.
Deployment time for the Azure resources varies depending on resource availability. Deploying ADLS Gen2, an Azure SQL single database, Azure Analysis Services, and the Machine Learning Services workspace generally takes ~20 minutes. Configuring your Discovery Hub environment generally takes ~30 minutes.
In this topic:
- Deploy TimeXtender with ADLS Gen2, Azure SQL DB, AAS, and ML
- Configure Accounts and Permissions
- Configure the Machine Learning Services Workspace
- Find the server name
- Connect to the virtual machine
- Configure the TimeXtender Environment
- Configure the ODX Server
- Download and configure template from CubeStore
- Configure Jupyter Notebook Inputs
1. Deploy TimeXtender with ADLS Gen2, Azure SQL DB, AAS, and ML
1. Go to https://azuremarketplace.microsoft.com/en-us/marketplace/
2. If you are not signed in to your account, sign in now.
3. In the Azure Marketplace, search for TimeXtender.
4. Select the deployment option from the search results.
5. Scroll down, select Get It Now, and then select Continue. You will be redirected to a page that contains a wizard for creating your resources.
6. In Step 1 of the wizard, configure the basic settings, and then click OK.
7. In Step 2 of the wizard, configure the data storage settings for ADLS Gen2, and then click OK.
8. In Step 3 of the wizard, configure the Databricks settings, and then click OK.
9. In Step 4 of the wizard, configure the SQL Server resources and settings, and then click OK.
10. In Step 5 of the wizard, configure the Azure Analysis Services resources and settings, and then click OK.
11. In Step 7 of the wizard, confirm the summary settings by clicking OK.
12. In Step 8 of the wizard, review the terms of the agreements and click Create to start the deployment to the resource group you chose.
2. Configure Accounts and Permissions
Ensure the users, user groups, and service accounts have the necessary access and rights to the new server. Refer to Configure User Accounts & Permissions to view the requirements.
3. Configure the Machine Learning Services Workspace
Provision an Azure Notebook Virtual Machine
- Go to the Machine Learning Services Workspace, click on Notebook VMs and then click New.
- Enter a name for your new Notebook VM and then click Create.
Upload the Jupyter Notebook
You must wait for the Notebook Virtual Machine to finish deploying before completing the steps below to upload the Jupyter Notebook.
1. Go to the Machine Learning Services Workspace, and click on the storage account.
2. Click on Files.
3. Click on the code-xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx file share.
4. Upload the Jupyter Notebook (located at the end of this article) to the following folder path:
code-xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx / Users / <Username>
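If you prefer the command line, the upload can also be scripted with the Azure CLI. The storage account name, account key, share name, username, and notebook filename below are placeholders for your own values, not names from this deployment:

```shell
# Upload the Jupyter Notebook to the workspace's file share.
# Replace <storage-account>, <account-key>, the code-... share name,
# <Username>, and the notebook filename with your own values.
az storage file upload \
  --account-name "<storage-account>" \
  --account-key "<account-key>" \
  --share-name "code-xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" \
  --source "./AutoMLRetailSales.ipynb" \
  --path "Users/<Username>/AutoMLRetailSales.ipynb"
```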
4. Find the server name
1. Once your items have been deployed, navigate to the resource group in Azure that contains all of the deployment items. The resource group was assigned when you configured the basic settings.
2. While viewing the Resource Group that contains all of your deployment items, find the SQL Server and click on it.
3. Scroll down the menu to find Properties and click on it. The Properties page contains a field called Server Name. Save this name somewhere safe, as it will be needed when configuring your Discovery Hub® environment.
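As an alternative to clicking through the portal, the server name can also be retrieved with the Azure CLI; the resource group name below is a placeholder for the one you chose during deployment:

```shell
# List the fully qualified server name of each SQL server in the
# resource group; <your-resource-group> is a placeholder.
az sql server list \
  --resource-group "<your-resource-group>" \
  --query "[].fullyQualifiedDomainName" \
  --output tsv
```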
5. Connect to the virtual machine
1. To activate and configure Discovery Hub, connect to the application server deployed on Azure. Navigate to the resource group in Azure that contains all of the deployment items.
2. Locate the Virtual Machine in the resource group and click on it.
3. While viewing the page for the virtual machine, click Connect and download the RDP file.
4. Open the RDP file once it has downloaded and click Connect.
5. Enter the credentials you created earlier for the virtual machine and click OK to connect.
6. Configure the TimeXtender environment
Once you are connected to the Virtual Machine, activate your Discovery Hub instance following the instructions in this article: First Time Set-Up of TimeXtender.
7. Configure the ODX server
Please see the article, Configure and Manage the ODX Server, for configuring the ODX Server using Azure Data Lake for data storage.
8. Download and configure template from the CubeStore
- In Discovery Hub, go to File > CubeStore and download the AutoML: Retail Sales Model template.
- Once the template has downloaded, run the Connection Wizard to edit the connections for your data warehouse and tabular endpoint.
- Point the data warehouse to the server you deployed earlier using SQL Server authentication, and create the database.
- Point the tabular endpoint to the Azure Analysis Services server you created earlier. You do not need to create the database; it will be created when you deploy your project.
9. Configure Jupyter Notebook Inputs
1. Go to the Machine Learning Services Workspace, find the Notebook VM you created earlier, and open the Jupyter Notebook that you uploaded earlier.
2. In the Read Input section of the Jupyter Notebook, enter the connection parameters for the server you created earlier, along with other Azure Subscription and resource details.
3. With the necessary inputs entered into the Jupyter Notebook, everything is configured. Run the remaining code blocks to run AutoML, which creates and tests the best model, predicts sales, and stores the results back in SQL. The forecasted sales data is stored in a table called dbo.MonthlySalesForecast, located in the same database you designated in the Read Input section of the Jupyter Notebook.
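The connection details entered in the Read Input section can later be reused to inspect the forecast table from any Python environment. The sketch below builds a pyodbc-style connection string and a query against dbo.MonthlySalesForecast; all variable names and placeholder values here are illustrative, not the notebook's actual input names:

```python
# Illustrative placeholders for the connection details used in the
# notebook's Read Input section; substitute your own values.
sql_server = "yourserver.database.windows.net"  # the Server Name saved earlier
sql_database = "DataWarehouse"
sql_username = "sqladmin"
sql_password = "<password>"

# ODBC connection string (assumes ODBC Driver 17 for SQL Server is
# installed, as it is on the Notebook VM image).
conn_str = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    f"SERVER={sql_server};DATABASE={sql_database};"
    f"UID={sql_username};PWD={sql_password}"
)

# Query to spot-check the forecast output after the notebook has run.
forecast_query = "SELECT TOP 10 * FROM dbo.MonthlySalesForecast"

# Example usage (requires network access to the Azure SQL server):
# import pyodbc
# import pandas as pd
# with pyodbc.connect(conn_str) as conn:
#     print(pd.read_sql(forecast_query, conn))
```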
This guide covered how to configure TimeXtender with ADLS Gen2, Azure SQL DB, AAS, and ML using the Azure Marketplace. Another article will soon be published that covers how to use the TimeXtender CubeStore template to map data into your Azure SQL Database, run AutoML, and integrate the output with your existing business data.