A Presentation of Dash Enterprise and Its Interface.pptx

MusaBadaru, Oct 12, 2024

About This Presentation

A brief explanation of how to prepare, deploy, and manage apps on the Dash Enterprise platform.


Slide Content

Dash Enterprise

Preparing Your App for Dash Enterprise

Migrating From Local Development to Dash Enterprise
The first step in deploying your Dash app to Dash Enterprise is to prepare your app's code with the files required for deployment. This guide is aimed at users who already have Dash apps running locally. If you haven't created a Dash app yet, we recommend trying out the Dash Enterprise Sample Apps & Templates.

Quick Start
- Procfile: Ensure that your project folder has a Procfile. See Application Structure for details. This is often: web: gunicorn app:server --workers 2
- requirements.txt: Ensure that your project folder has a requirements.txt file listing your Python dependencies and versions. Don't forget gunicorn!
- server = app.server: Declare the underlying server variable so it can be referenced by the web command in your Procfile.
- Remove secrets & passwords: Remove sensitive passwords and API keys from your code and replace them with environment variables.
- System-level dependencies: If your project requires system-level dependencies such as database drivers, install them with a predeploy script and an apt-packages file.
- Self-contained: Ensure that all files and datasets that the app code references or saves are within the same project folder.
- Initialize your app in the Dash Enterprise App Manager UI.
- Deploy your app by running the Git instructions generated by the App Manager.
A minimal example app follows this checklist.
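As a minimal sketch of an app prepared this way (the layout content and worker count are illustrative, not part of the official guide), the key line is server = app.server, which exposes the Flask server that gunicorn references in the Procfile:

# app.py -- minimal Dash app ready for Dash Enterprise (illustrative)
import dash
from dash import html

app = dash.Dash(__name__)
server = app.server   # referenced by "web: gunicorn app:server" in the Procfile

app.layout = html.Div("Hello from Dash Enterprise")

if __name__ == "__main__":
    app.run_server(debug=True)   # local development only; gunicorn runs the app in production

# Procfile (no file extension):
#   web: gunicorn app:server --workers 2
# requirements.txt must include dash and gunicorn.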

Initializing a Dash App
1. Go to the Dash Enterprise App Manager at https://<your-dash-enterprise>/Manager.
2. In the top right-hand corner, select Initialize App.
3. In the modal that appears, enter a name for your Dash app (app names must start with a lowercase letter and must contain only lowercase letters, numbers, and -). Keep in mind that this name will be part of the URL for your app.
4. Select Create.

Deploying an App
To deploy an app to Dash Enterprise, you can get started by downloading a sample app. Alternatively, you can create a new app or deploy an existing app, as described on the following slides.

Downloading a Sample App
1. Browse the sample apps at https://<your-dash-enterprise-hostname>/Docs/templates (substituting <your-dash-enterprise-hostname> with your Dash Enterprise URL); then download and unzip the files for the app that you want to deploy.
2. Go to the sample app folder and initialize the Git repository:
$ cd <sample-app>
$ git init
where <sample-app> is the name of the folder containing the sample app files.
3. In the Overview of the app you have initialized, find the command for adding a Git remote (it begins with git remote add plotly). The full command contains a URL that changes depending on whether you're deploying over HTTPS or SSH. Select the right protocol; then copy the command.
4. Run the command you copied in step 3 in the root of your sample app folder.
5. Deploy the app to Dash Enterprise by running the following Git commands:
$ git status   # view the changed files
$ git diff     # view the actual changed lines of code
$ git add .    # add all the changes
$ git commit -m "<commit-message>"
$ git push plotly master

Creating a New App
1. Create a new folder for your app files:
$ mkdir dash-app-example
$ cd dash-app-example
2. Initialize the Git repository:
$ git init
3. Create the following files in your folder:
- app.py
- .gitignore, which determines which files and folders are ignored by Git and therefore not copied to the server when you deploy your app:
venv
*.pyc
.DS_Store
.env
- Procfile (make sure you create this file without a file extension):
web: gunicorn app:server --workers 4
- requirements.txt, which describes your app's Python dependencies. We recommend generating your requirements.txt automatically by running:
$ pip freeze > requirements.txt

Creating a New App (continued)
4. (Recommended) Create a virtual environment:
$ python -m venv venv
A virtual environment is not required to deploy your app, but it is helpful when writing and previewing your app locally. Activate it with source venv/bin/activate (Mac/Linux) or source venv/Scripts/activate (Windows), then install your app's requirements with pip install -r requirements.txt. You now have a fresh Python environment that is dedicated to your app. Note that the virtual environment creates a venv directory in your app folder, but it is ignored by Git thanks to the .gitignore file.
5. In the Overview of the app you have initialized, find the command for adding a Git remote (it begins with git remote add plotly). The full command contains a URL that changes depending on whether you're deploying over HTTPS or SSH. Select the right protocol; then copy the command.
6. Run the command you copied in step 5 in the root of your app folder.
7. Deploy the app to Dash Enterprise by running the following Git commands:
$ git status   # view the changed files
$ git diff     # view the actual changed lines of code
$ git add .    # add all the changes
$ git commit -m "<commit-message>"
$ git push plotly master

Deploying an Existing App
1. Go to your app folder and initialize the Git repository if you haven't already. (This creates a hidden .git directory in your folder that contains Git metadata. If you cloned the folder from Dash Enterprise or another source like GitHub, this metadata already exists, so you can skip this step.)
$ cd <app-folder>
$ git init
where <app-folder> is the name of the folder containing your app files.
2. In the Overview of the app you have initialized, find the command for adding a Git remote (it begins with git remote add plotly). The full command contains a URL that changes depending on whether you're deploying over HTTPS or SSH. Select the right protocol; then copy the command.
3. Run the command you copied in step 2 in the root of your app folder.
4. Deploy the app to Dash Enterprise by running the following Git commands:
$ git status   # view the changed files
$ git diff     # view the actual changed lines of code
$ git add .    # add all the changes
$ git commit -m "<commit-message>"
$ git push plotly master

App Configuration

App Structure, Buildpacks, and Deployment Lifecycle
Lifecycle: When you run git push plotly master, Dash Enterprise does the following:
1. Mounts the app source code.
2. Detects which buildpack to use based on the files present in the app's root folder. The Python & pip buildpack is detected by discovering the requirements.txt file.
3. Installs Python 3.6.10; override the version with a runtime.txt file.
4. Installs custom APT packages if an apt-packages file is provided, and custom .deb packages if a dpkg-packages folder is provided (optional).
5. Installs the Python dependencies specified in the requirements.txt file.
6. Runs a build script if an app.json file with a "predeploy" field is included (optional). Changes made by this script are committed to each Docker image. At this point, the Docker images have been created. In Dash Enterprise Kubernetes, these images are uploaded to the container registry.
7. Creates Docker containers from the Docker image on the host (Dash Enterprise Single Server) or in the Kubernetes cluster (Dash Enterprise Kubernetes). The number of containers created for each process type can be configured with the DOKKU_SCALE file (optional) or in the App Manager.
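For illustration only (the version string and process counts are placeholders, not values from the official guide), the optional override files mentioned above typically look like this: runtime.txt pins the Python version and DOKKU_SCALE sets the container count per Procfile process type.

# runtime.txt -- pin the Python version used by the buildpack (illustrative version)
python-3.9.7

# DOKKU_SCALE -- number of containers per process type (illustrative counts)
web=2
worker=1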

App Structure, Buildpacks, and Deployment Lifecycle (continued)
Lifecycle, continued:
8. Runs the app health checks. If the health checks fail, the deployment is aborted and the previous containers keep running. Override the default health checks with the CHECKS file (Dash Enterprise Single Server) or the readiness field in the app.json file (Dash Enterprise Kubernetes). At this point, logs are no longer available in the deployment and are only available in the application logs via the App Manager UI.
9. Runs the commands specified in the Procfile in the appropriate container(s) to start the app, job queue, or background process.
10. Runs the postdeploy script in each container if the app.json file includes a postdeploy field (optional).
11. Release: opens the app to web traffic and removes the old containers and images.
12. Runs periodic liveness checks on Dash Enterprise Kubernetes if app.json includes a liveness field, to ensure that the app is still up and to restart it if not (not available on Dash Enterprise Single Server).
13. Restarts the deployment process every 24 hours on Dash Enterprise Kubernetes to prevent long-running apps from going down (not available on Dash Enterprise Single Server).
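As a rough, hypothetical sketch of the CHECKS file mentioned in step 8, assuming the Dokku-style format of optional settings followed by paths to probe (the values and the probed path are assumptions; consult your Dash Enterprise server's documentation for the exact format):

# CHECKS -- custom health checks for Dash Enterprise Single Server (illustrative values)
WAIT=15        # seconds to wait before the first check
ATTEMPTS=5     # number of times to retry a failing check
/              # request the app root and expect a successful response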

Configuring System Dependencies
Installing APT Packages: In the root of your application folder, create a file called apt-packages. Here you may specify APT packages to be installed, with one package per line. For example, to install the ODBC driver we could include an apt-packages file that looks like:
unixodbc
unixodbc-dev
Adding a Pre-Deploy Script: You may also include a pre-deploy script that executes in your Dash app's environment. In the case of adding an ODBC driver, we need to copy the ODBC initialization files into the correct system paths. To do so, we include the ODBC initialization files in the app folder and then copy them into the system paths in the pre-deploy script. The script can have any name, because we specify that name in an application configuration file, app.json. For the purposes of this example, assume we have named it setup_pyodbc and placed it in the root of our app folder:
cp /app/odbc.ini /etc/odbc.ini
cp /app/odbcinst.ini /etc/odbcinst.ini
Running a Pre-Deploy Script Using app.json: Next, we instruct Dash Enterprise to run our setup_pyodbc file by adding a JSON configuration file named app.json to the root of our app folder:
{
  "scripts": {
    "dokku": {
      "predeploy": "/app/setup_pyodbc"
    }
  }
}
Now when the app is deployed, it will install the APT packages specified in the apt-packages file and run the setup file specified in app.json. In this case, that lets us install and then configure the ODBC driver.
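Once the driver is installed and configured, a hedged sketch of how the app might use it: the driver name, host, credentials, and query below are placeholders, and the real connection string depends on your database.

# Hypothetical example of querying a database through the ODBC driver installed above.
import os
import pyodbc

# Placeholder driver name and credentials; the driver name must match an entry in odbcinst.ini,
# and the credentials come from environment variables set in the App Manager.
conn_str = (
    "DRIVER={" + os.environ.get("ODBC_DRIVER", "ODBC Driver 17 for SQL Server") + "};"
    "SERVER=" + os.environ.get("DB_HOST", "localhost") + ";"
    "UID=" + os.environ.get("DB_USER", "") + ";"
    "PWD=" + os.environ.get("DB_PASSWORD", "") + ";"
)

with pyodbc.connect(conn_str) as conn:
    cursor = conn.cursor()
    cursor.execute("SELECT 1")   # trivial query to confirm the driver and credentials work
    print(cursor.fetchone())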

Adding Static Assets
Adding Your Own CSS and JavaScript to Dash Apps: Including custom CSS or JavaScript in your Dash apps is simple. Just create a folder named assets in the root of your app directory and include your CSS and JavaScript files in that folder. Dash automatically serves all of the files in this folder.
Embedding Images in Your Dash Apps: Apps deployed on Dash Enterprise are served under an app name prefix. As a consequence, images added to the assets folder require the app name prefix in their relative path. For example, consider the folder structure below; a sketch of referencing the image follows it.
-- app.py
-- assets/
   |-- my-image.png
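A minimal sketch of referencing that image in a layout, using Dash's asset URL helper so the app name prefix is handled for you (the image file name matches the structure above; the rest of the layout is illustrative):

# Hypothetical layout snippet; app.get_asset_url() prepends the app's URL prefix
# (including the Dash Enterprise app name) to files in the assets/ folder.
import dash
from dash import html

app = dash.Dash(__name__)
server = app.server

app.layout = html.Div([
    html.Img(src=app.get_asset_url("my-image.png")),  # resolves to /<app-name>/assets/my-image.png
])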

User Interface

Dash App Portal
Located at https://<your-dash-enterprise>/Portal, the Dash App Portal is the front page for your Dash Enterprise platform. It allows multiple users to prominently display their apps in one central location. For your app to appear on the Dash App Portal, turn on Show in Portal in your app's settings from within the Dash Enterprise App Manager. Then select Add Info to customize how your app will look in the Portal, add tags to make it easier to find, set its rank in the Portal, and more.
Portal Settings: The Portal Settings are only available to Administrators. In the Portal Settings, you can configure privacy for your organization's Portal and customize its look and feel.

Dash App Privacy
Starting in version 3.0.0 of Dash Enterprise, you can restrict who is able to view your app. This also restricts who is able to see it in the Dash App Portal. In your list of apps (https://<your-dash-enterprise>/Manager/apps), select the app whose privacy you want to configure; then go to its Settings. Find the App Privacy setting; then choose between Restricted, Authenticated, or Unauthenticated. If you choose the Restricted setting, a field appears where you enter the list of usernames authorized to view the app. Additionally, if LDAP is enabled, you can also add entire LDAP groups.

Adding Redis to an App
Redis is a powerful in-memory database that is well suited to many Dash apps. In particular, you can use Redis to:
- Save app data that persists in memory across processes, containers, and deploys.
- Enable queued and background processes with Celery.
- Use the Dash Enterprise Snapshot Engine (see the documentation on your Dash Enterprise server for details).
- Cache data from your callbacks across processes.
- Use background callbacks for callbacks that take a long time to complete.
Creating and Linking Databases via the Dash Enterprise App Manager: You can create and link a Redis database to your Dash app using the Dash Enterprise App Manager. Here, you have two options: create a database before initializing an app, or create and link a database after an app has been initialized.

Adding Redis to an App (continued)
Creating a Database Before Initializing an App: To create a database and link it at app initialization time:
1. Go to Databases.
2. Select Create Database.
3. In the modal that appears, enter a name for your database (we recommend a convention like appending -redis to what will be your app name).
4. In Database Type, select Redis.
5. Select Create.
6. Go to Apps and initialize your app, selecting your database in Link a Database.
Creating and Linking a Database After an App Has Been Initialized: To create and link a database for an existing app:
1. In your app's Settings, go to Databases.
2. Select Create and link database.
3. Enter a name for your database (we recommend a convention like appending -redis to your app name).
4. In Type, select Redis.
5. Select Create and link.
Using Redis inside Workspaces: Dash Enterprise Workspaces share the same Redis instance as the deployed app. This lets you inspect the data in your Redis instance from within a workspace. However, it is important to structure your code so that it does not overwrite the deployed app's Redis data. Each Redis instance has 16 available databases. A hedged connection example follows.
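As a minimal sketch of connecting to the linked database from app code, assuming the connection string is exposed through a REDIS_URL environment variable (check your app's environment variables in the App Manager for the exact name on your server):

# Hypothetical Redis connection; REDIS_URL is assumed to be injected when the database is linked.
import os
import redis

redis_client = redis.StrictRedis.from_url(
    os.environ.get("REDIS_URL", "redis://127.0.0.1:6379"),  # fall back to a local Redis for development
    decode_responses=True,   # return str instead of bytes
)

redis_client.set("last-refresh", "2024-10-12")   # persists across processes, containers, and deploys
print(redis_client.get("last-refresh"))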

Setting Environment Variables
Adding Environment Variables: To add environment variables via the Dash Enterprise UI, go to your app's settings. Enter the environment variable name and value; then select Add. Once added, the values of your environment variables are hidden unless you turn on Show environment variables.
Referencing Environment Variables in Your Code: You can reference these variables with os.environ from Python's os module:
database_password = os.environ['DATABASE_PASSWORD']
Alternatively, if the variable isn't in your environment and you want to fall back to some other value, use:
database_password = os.environ.get('DATABASE_PASSWORD', 'my-default-database-password')
Defining Environment Variables in Your Local Environment: Because we reference these environment variables in our code, we need to add them to our local environment as well. One easy way to do this is to define the variables on the fly when you run python app.py. That is, instead of running python app.py, run:
$ DATABASE_USER=jessica DATABASE_PASSWORD=my-password python app.py
Alternatively, you can define them for your session by "exporting" them:
$ export DATABASE_USER=jessica
$ export DATABASE_PASSWORD=my-password
$ python app.py

Advanced

Setting Up an SSH Key for Dash Enterprise
Generating a New SSH Key Pair: In this step, you'll generate a new SSH key pair and configure the SSH agent to remember your passphrase so that you don't have to enter it each time you deploy your app with git push. The instructions below assume that you are using Git Bash, which is included in the official Git for Windows release.
To generate a new SSH key pair and configure the SSH agent:
1. In Git Bash, generate a new key pair, replacing the example email address with your own:
ssh-keygen -t rsa -b 4096 -C "your_email@example.com"
2. When prompted for a location to save the key pair, press Enter to accept the default location.
3. When prompted, enter a passphrase.
4. Check that the SSH agent is running:
eval $(ssh-agent -s)
5. Add your SSH private key to the SSH agent:
ssh-add ~/.ssh/id_rsa

Setting Up an SSH Key for Dash Enterprise (continued)
Adding the SSH Public Key to Dash Enterprise: In this step, you'll add the SSH public key to Dash Enterprise using the Dash Enterprise UI.
To add the SSH public key to Dash Enterprise:
1. Copy the SSH public key:
clip < ~/.ssh/id_rsa.pub
2. Go to Dash Enterprise and log in if you aren't already.
3. Select your username > Manage SSH Public Keys.
4. Select Add Key, paste your key, and then select Add.
Modifying Your SSH Config: In this step, you'll specify a custom port in your SSH config. By default, the port you need to specify is 3022, but your server administrator may have set it to something different. The config file is located at ~/.ssh/config; if it's not there, create it. Add the following lines, replacing your-dash-enterprise-server with the domain of your Dash Enterprise server (without http:// or https://):
Host your-dash-enterprise-server
    Port 3022
Be careful not to save your SSH config as a .txt file, as it will not be recognized by Git when deploying your applications.

Adding Private Python Packages
There are two common approaches: using a private Python package index provided by your organization, or including the Python packages directly in your app's files.
Using a Private Python Package Index: If your organization manages a private Python package index, you can set it as an extra index that your app checks when installing dependencies. To install a package from a private Python package index:
1. At the top of requirements.txt, add the following line, where <private-package-index> is the URL of your organization's private Python package index (you can omit the --trusted-host flag if the index uses HTTPS):
--extra-index-url <private-package-index> --trusted-host <private-package-index>
2. List the package name in requirements.txt on its own line, the same way you would for a public package. For example:
my-private-package
Your app continues to look for packages on PyPI.org, but it now also looks in your private Python package index for any packages it can't find there.
Including the Package in Your App Files: By placing the package's .tar.gz (tarball) or .whl (wheel) file directly in your app's code, your app can install the package from this file; a short sketch follows.
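As an illustrative sketch of the second approach (the file path, package name, and version are placeholders), pip accepts a relative path to a wheel or tarball directly in requirements.txt alongside regular package names:

# requirements.txt -- install a vendored wheel shipped with the app (path and version are illustrative)
dash
gunicorn
./packages/my-private-package-1.0.0-py3-none-any.whl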

Linking a Celery Process
Celery is a reliable asynchronous task queue/job queue that supports both real-time processing and task scheduling in production systems, which makes it well suited to Dash apps. For more detail on how to use Celery within Dash Enterprise, see the Snapshot Engine documentation (https://<your-dash-enterprise>/Docs/dash-snapshots/usage-job-queue) or the Sample Applications and Templates at https://<your-dash-enterprise>/Docs/templates.
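As a minimal, hypothetical sketch of a Celery job queue backed by the linked Redis instance (the task, environment variable name, and Procfile entry are assumptions, not taken from the official guide):

# tasks.py -- hypothetical Celery app using the linked Redis database as broker and result backend.
import os
from celery import Celery

redis_url = os.environ.get("REDIS_URL", "redis://127.0.0.1:6379")
celery_app = Celery("tasks", broker=redis_url, backend=redis_url)

@celery_app.task
def long_running_job(n):
    """Placeholder background task; a real app would do its heavy computation here."""
    return sum(range(n))

# Procfile entry for the worker process (illustrative):
#   worker: celery -A tasks worker --loglevel=INFO --concurrency=2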

Connecting a Dash App to an SQL Database
Managing Connection Drivers and Libraries: Dash apps can use open-source libraries to query external databases and datastores in callbacks or job queues.
Working with Connection Objects in a Dash App: Many of the database examples follow a similar pattern:
1. Install the database driver and system-level dependencies. This varies by database type, so we recommend reviewing the examples we've provided first. Many database drivers are more easily installed on Linux environments, so we recommend developing these apps in a Dash Enterprise Workspace.
2. Create a connection. Store your connection's database password as an environment variable in your App Settings in the Dash Enterprise App Manager instead of storing it in code.
3. Test that the database server has started successfully, verify that it is accepting connections, and validate the credentials and connection string. We can create a simple try_connection function that sends a basic query to the database and checks whether it succeeds. If it fails, the connection is retried after an exponentially increasing delay. This lets us distinguish between errors arising from issues with the callback logic and errors caused by database configuration or connectivity problems. A sketch of such a function follows.
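A rough sketch of the try_connection idea described above, assuming SQLAlchemy; the retry count, delays, and connection string are illustrative:

# Hypothetical connectivity check with exponential backoff, using SQLAlchemy.
import time
from sqlalchemy import create_engine, text

def try_connection(engine, attempts=5, initial_delay=1.0):
    """Send a trivial query; retry with an exponentially increasing delay on failure."""
    delay = initial_delay
    for attempt in range(1, attempts + 1):
        try:
            with engine.connect() as conn:
                conn.execute(text("SELECT 1"))   # basic query to verify connectivity
            return True
        except Exception as exc:
            print(f"Connection attempt {attempt} failed: {exc}; retrying in {delay:.0f}s")
            time.sleep(delay)
            delay *= 2
    return False

# Illustrative usage; the connection string is a placeholder.
# engine = create_engine("postgresql+psycopg2://username:password@localhost/test")
# database_is_ready = try_connection(engine)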

Connecting a Dash App to an SQL Database (continued)
Working with Connection Objects in a Dash App, continued:
4. Consider connection pooling. The examples above don't use database connection pools. Connection pools have tricky implications when used alongside Celery or gunicorn's --preload option. If you aren't using --preload or Celery, you can improve your query performance by using a connection pool: connections are shared instead of being recreated and discarded. It is possible to use --preload and Celery with a connection pool, but it requires extra caution. To disable connection pooling, use the NullPool:
from sqlalchemy.pool import NullPool
engine = create_engine('postgresql+psycopg2://username:password@localhost/test', poolclass=NullPool)
5. Construct your SQL query. You have several options here. One is to open a connection when you need to perform an operation and close it afterwards; in SQLAlchemy, this can be done with the with engine.connect() clause. While a NullPool does this for us by default, we can also use the Engine object to implicitly open and close connections on an ad hoc basis, for both this and other pooling implementations. Enclosed in the with context statement, the connection object returned by Engine.connect() is automatically closed by Connection.close() at the end of the code block. A sketch of this pattern inside a callback follows.
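A brief, hypothetical illustration of the ad hoc open/close pattern inside a callback; the connection string, table, and column names are placeholders:

# Hypothetical callback that opens a connection only for the duration of the query.
from dash import Dash, Input, Output, dcc, html
from sqlalchemy import create_engine, text
from sqlalchemy.pool import NullPool

engine = create_engine(
    "postgresql+psycopg2://username:password@localhost/test",  # placeholder connection string
    poolclass=NullPool,
)

app = Dash(__name__)
server = app.server
app.layout = html.Div([dcc.Input(id="region", value="east"), html.Div(id="total")])

@app.callback(Output("total", "children"), Input("region", "value"))
def show_total(region):
    # The connection is opened here and closed automatically when the block exits.
    with engine.connect() as conn:
        row = conn.execute(
            text("SELECT SUM(amount) FROM sales WHERE region = :region"),  # placeholder table/columns
            {"region": region},
        ).fetchone()
    return f"Total: {row[0]}"

if __name__ == "__main__":
    app.run_server(debug=True)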

Dash App Authentication
Dash Enterprise automatically implements user authentication if your Dash app's privacy is set to Restricted or Authenticated, but not if it is set to Unauthenticated. You can access the authentication data within your app using the dash-enterprise-auth package.
Using dash-enterprise-auth in an Existing Dash App: If you have previously deployed your Dash app to Dash Enterprise, add dash-enterprise-auth to your requirements.txt file to get started. With dash-enterprise-auth, you can use the get_username and get_user_data methods to get information about the app viewer. These methods must be called from within callbacks. dash-enterprise-auth also includes the create_logout_button method, which allows you to add a logout button to your app's layout. This button uses a special URL to log the user out. To use create_logout_button when developing locally, you need to set an environment variable called DASH_LOGOUT_URL. You can do this by running your code with DASH_LOGOUT_URL=https://plotly.com python app.py (replacing https://plotly.com with the URL you want to be redirected to after logout). If you're developing in Workspaces, DASH_LOGOUT_URL is supplied automatically. A short sketch follows.
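A minimal sketch of the usage described above, assuming the package is importable as dash_enterprise_auth; the layout and component IDs are illustrative:

# Hypothetical use of dash-enterprise-auth to greet the viewer and offer a logout button.
import dash
from dash import html, Input, Output
import dash_enterprise_auth as auth

app = dash.Dash(__name__)
server = app.server

app.layout = html.Div([
    html.Div(id="greeting"),
    auth.create_logout_button(),   # requires DASH_LOGOUT_URL when running locally
    html.Div(id="trigger"),        # empty div used only to fire the callback on page load
])

@app.callback(Output("greeting", "children"), Input("trigger", "children"))
def greet(_):
    # get_username must be called from inside a callback.
    return f"Hello, {auth.get_username()}!"

if __name__ == "__main__":
    app.run_server(debug=True)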

THANK YOU