…and use it to connect to your database.

#### Step 1: Navigate to Apps

In your Snowflake account, go to `Catalog` > `Apps` to open the Native Apps collection. If Posit Workbench is not already installed, click `Browse all apps` and search for `Posit Workbench` to find it. Please note that the Native App must be [installed and configured](https://docs.posit.co/ide/server-pro/integration/snowflake/native-app/install.html) by an administrator.

![](assets/snowflake/v2-05-find_posit_workbench.png)

#### Step 2: Activate and Launch the Posit Workbench Native App

Once Posit Workbench is installed, click the app under `Installed Apps` to launch it. If you do not see the Posit Workbench app listed, ask your Snowflake account administrator for access to the app.

After clicking on the app, you will see a page with activation steps and a blue `Activate` button.

![](assets/snowflake/v2-06-Activate_App.png)

Click `Activate`. This takes you to the next step, where you will see a `Launch app` button.
![](assets/snowflake/v2-07-Launch_App.png)

To set up OAuth, click the `Connections` tab, and then click `Configure` in the section titled `Snowflake OAuth Integration`. You will then be prompted to enter your username and password, along with instructions to complete the OAuth setup.

![](assets/snowflake/v2-08-OAuth-setup.png)

![](assets/snowflake/v2-09-OAuth-username-password.png)

Finally, click `Launch app`. This takes you to the webpage generated for the Workbench application. You may be prompted to first log in to Snowflake using your regular credentials or authentication method.
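If you have administrator privileges and want to confirm that the OAuth security integration for the app was created, a quick check from a Snowflake worksheet is sketched below (the integration created by the Native App may carry a different name in your account):

```sql
-- List security integrations and look for the one created for Posit Workbench
SHOW SECURITY INTEGRATIONS;
```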

### Create an RStudio Pro Session

Follow the sign-in prompts.

![](assets/posit_workbench/03-snowflake_login.png)

If sign-in fails, verify the OAuth configuration and the account network policy.

If the Snowflake sign-in in RStudio doesn’t complete (the Snowflake button doesn’t turn blue) or you see 403/blocked errors:
* OAuth setup for the app: Ensure the Posit Workbench Native App is configured to use OAuth so users can authenticate from inside RStudio. Follow the install steps: [Install with OAuth](https://docs.posit.co/partnerships/snowflake/workbench/native-app/install.html#oauth).

* Account network policy: If your account enforces a network policy, allow connections from the Posit Workbench Native App back to Snowflake. Guidance and remediation: [Account network policies](https://docs.posit.co/partnerships/snowflake/workbench/native-app/troubleshooting.html#account-network-policies).

Try again: After updating OAuth and/or network policy, return to RStudio in Posit Workbench and click the Snowflake button to sign in.

Tip: Most sign-in issues are due to incomplete OAuth configuration or a restrictive network policy. Ask an admin to review both if you don’t have permissions.
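If you have an administrator role, a quick way to review network policies from a SQL worksheet is sketched below (the policy name is a placeholder):

```sql
-- List the network policies defined in the account
SHOW NETWORK POLICIES;

-- Inspect the allowed and blocked IP lists of a specific policy (name is hypothetical)
DESCRIBE NETWORK POLICY my_account_policy;
```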


When you're successfully signed into Snowflake, the Snowflake button will turn blue
and there will be a checkmark in the upper-left corner.
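As a quick sanity check, you can run a simple query over your new connection (for example, through the connection you create in R, or from a SQL worksheet) to confirm the session context:

```sql
-- Confirm which user, role, and warehouse the authenticated session is using
SELECT CURRENT_USER(), CURRENT_ROLE(), CURRENT_WAREHOUSE();
```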

To run the app, open `app.R` and then click the Run App button at the top of the editor.

![](assets/shiny/run.png)

If you see a missing-package error such as `there is no package called ‘xyz’`, install the required package by running `install.packages("xyz")` in the R console.

Change the metric in the sidebar to control which metric is plotted.

### Learn More About Shiny
tags: Getting Started, Gen AI, Data Engineering, Snowpark, Python, Snowflake C…

Duration: 15

**Note:** We recommend checking out our [Building Cortex AISQL Powered Call Centre Analytics Solution](https://quickstarts.snowflake.com/guide/building-cortex-aisql-powered-call-centre-analytics/index.html?index=..%2F..index#0) quickstart, which uses Snowflake's `AI_TRANSCRIBE` function instead of hosting Whisper models in SPCS for audio-to-text conversion. This original version is useful if Cortex cannot be used or if further model tuning is desired.
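For reference, a minimal `AI_TRANSCRIBE` call might look like the sketch below; the stage and file names are assumptions for illustration, and the exact `TO_FILE` arguments may vary with your Snowflake version:

```sql
-- Transcribe a staged audio file to text (stage and file names are hypothetical)
SELECT AI_TRANSCRIBE(TO_FILE('@call_audio_stage', 'call_001.mp3')) AS transcript;
```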

Audio files in call centers offer rich insights beyond text. With Snowflake Cortex functions and open-source LLM models running in Snowpark Container Services, you can extract call summaries, sentiment, and patterns that help enhance customer experiences. By transcribing audio to text and developing custom analytics, call centres and supervisors gain actionable insights into agent responses and proactive issue resolution, ultimately driving better customer satisfaction. These insights can inform strategic decision-making, improve operational efficiency, and drive revenue growth. From optimizing agent performance to predicting customer behavior, the possibilities are endless.
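To make this concrete, once call transcripts are available as text, Cortex functions such as `SNOWFLAKE.CORTEX.SUMMARIZE` and `SNOWFLAKE.CORTEX.SENTIMENT` can extract these signals directly in SQL. The sketch below assumes a hypothetical `CALL_TRANSCRIPTS` table:

```sql
-- Summarize each call and score its sentiment; table and column names are illustrative
SELECT call_id,
       SNOWFLAKE.CORTEX.SUMMARIZE(transcript) AS call_summary,
       SNOWFLAKE.CORTEX.SENTIMENT(transcript) AS call_sentiment
FROM call_transcripts;
```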

In this quickstart, you focus on a scenario where you are a supervisor at a vehicle insurance call centre. As a supervisor, you need to track key metrics about your agents, such as Average Handle Time (AHT), the total number of first-call resolutions, and sentiment counts, to name a few. By running the Whisper model within Snowpark Container Services, you can transcribe text from audio and capture call duration, and with Snowflake Cortex functions, supervisors can access all these details through a Streamlit app. Supervisors can also ask questions about the extracted audio files using natural language. Following is the solution diagram for building this solution end-to-end in Snowflake.
```shell
git clone https://github.com/Snowflake-Labs/sfguide-call-centre-analytics-with-snowflake-cortex-and-spcs
```
Open a terminal and run the following commands to create a conda virtual environment and install a few packages:

```shell
conda create --name demosnowparkdemo --override-channels -c https://repo.anaconda.com/pkgs/snowflake python=3.10

conda activate demosnowparkdemo

conda install snowflake-snowpark-python pandas pyarrow streamlit
```
In this step we are hosting an LLM model from [NumberStation](https://huggingface

Run the following notebook to bring up the text2sql container and create the other objects required to run the container in SPCS.

> This is required only if you want the text2sql capabilities in the Streamlit app; you can skip this step if you don't want to explore this feature. This capability is now offered by the Cortex Text2SQL function. These steps help you learn how to host your own LLM model, fine-tune it on your dataset, and eventually run model inference, all inside the Snowflake platform.

[text2sql_setup_code.ipynb](https://github.com/Snowflake-Labs/sfguide-call-centre-analytics-with-snowflake-cortex-and-spcs/blob/main/text2sql/text2sql_setup_code.ipynb), found in the **text2sql** folder of your cloned local directory.
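As a hedged alternative to hosting your own model, you can approximate the text2sql pattern by prompting a Cortex LLM with your schema through `SNOWFLAKE.CORTEX.COMPLETE`. The sketch below is illustrative only; model availability varies by region, and the table in the prompt is hypothetical:

```sql
-- Ask an LLM to draft SQL from a natural-language question (model and schema are assumptions)
SELECT SNOWFLAKE.CORTEX.COMPLETE(
    'mistral-large',
    'Given a table CALL_METRICS(agent_name STRING, aht_seconds NUMBER, call_date DATE), ' ||
    'write a SQL query that returns the average handle time per agent over the last 7 days.'
) AS generated_sql;
```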

For more information on these objects, check out [this article](https://medium.c

How are customers and partners using Snowpark Container Services today? Containerized services on Snowflake open up the opportunity to host and run long-running services, like front-end web applications, all natively within your Snowflake environment. Customers are now running GPU-enabled machine learning and AI workloads, such as GPU-accelerated model training and open-source Large Language Models (LLMs) as jobs and as service functions, including fine-tuning of these LLMs on your own Snowflake data, without having to move the data to external compute infrastructure. Snowpark Container Services are an excellent path for deploying applications and services that are tightly coupled to the Data Cloud.

Note that while in this quickstart we will predominantly use direct SQL commands to interact with Snowpark Container Services and their associated objects, there is also [Python API support](https://docs.snowflake.com/developer-guide/snowflake-python-api/snowflake-python-overview) that you can use. Refer to the [documentation](https://docs.snowflake.com/developer-guide/snowflake-python-api/snowflake-python-overview) for more info.

### What you will learn
- The basic mechanics of how Snowpark Container Services works
Duration: 10

```bash
# test the connection:
snow connection test --connection "CONTAINER_hol"
```
If you encounter an error requesting MFA for your account, don't worry! There are several authentication methods available in Snowflake. For alternative ways to sign in, please refer to [Additional ways to authenticate your connection](https://docs.snowflake.com/en/developer-guide/snowflake-cli/connecting/configure-connections#additional-ways-to-authenticate-your-connection). Below is an example of how to authenticate using an [MFA passcode](https://docs.snowflake.com/en/developer-guide/snowflake-cli/connecting/configure-connections#use-multi-factor-authentication-mfa) with [MFA caching](https://docs.snowflake.com/en/developer-guide/snowflake-cli/connecting/configure-connections#use-mfa-caching) enabled. Follow step 1 in [MFA passcode](https://docs.snowflake.com/en/developer-guide/snowflake-cli/connecting/configure-connections#use-multi-factor-authentication-mfa) to set up the MFA authenticator for your account. To enable MFA caching, set the "authenticator" parameter in the connection profile in your config.toml file to "username_password_mfa":
```bash
name : CONTAINER_hol
...
authenticator : username_password_mfa
```
Then test the connection, passing the passcode from your authenticator app:
```bash
snow connection test --connection "CONTAINER_hol" --mfa-passcode <pass-code>
```

4. Start Docker by opening Docker Desktop.

5. Test that we can successfully log in to the image repository we created above, `CONTAINER_HOL_DB.PUBLIC.IMAGE_REPO`. Run the following using the Snowflake VSCode Extension or in a SQL worksheet and copy the `repository_url` field, then execute a `snow spcs image-registry login` from the terminal:
```sql
// Get the image repository URL
use role CONTAINER_user_role;
show image repositories in schema CONTAINER_HOL_DB.PUBLIC;
```
Then, from the terminal:
```bash
snow spcs image-registry login --connection CONTAINER_hol
> prompt for password
```
**Note** the difference between `REPOSITORY_URL` (`org-account.registry.snowflakecomputing.com/container_hol_db/public/image_repo`) and `SNOWFLAKE_REGISTRY_HOSTNAME` (`org-account.registry.snowflakecomputing.com`)

<!-- ------------------------ -->
## Build and Run Jupyter Service
```
ENTRYPOINT ["jupyter", "notebook","--allow-root","--ip=0.0.0.0","--port=8888","-
```
This is just a normal Dockerfile, where we install some packages, change our working directory, expose a port, and then launch our notebook service. There's nothing unique to Snowpark Container Services here!

Let's build and test the image locally from the terminal. **Note:** It is a best practice to tag your local images with a `local_repository` prefix. Often, users will set this to a combination of first initial and last name, e.g. `jsmith/my-image:latest`. Navigate to your local clone of `.../sfguide-intro-to-snowpark-container-services/src/jupyter-snowpark` and run a Docker build command:
```bash
cd .../sfguide-intro-to-snowpark-container-services/src/jupyter-snowpark
docker build --platform=linux/amd64 -t <local_repository>/python-jupyter-snowpark:latest .
```

The only thing unique to Snowflake about this container is that the REST API code expects to receive requests in the format that [Snowflake External Function](https://docs.snowflake.com/en/sql-reference/external-functions-data-format#body-format) calls are packaged, and must also package the response in the expected format so that we can use it as a Service Function. **Note:** This is only required if you intend to interact with the API via a SQL function.
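For illustration, a service function that routes SQL calls to this container's endpoint looks roughly like the sketch below; the function name, endpoint name, and path are assumptions based on this lab's conventions:

```sql
-- Map a SQL function to the container's REST endpoint so it can be called from queries
CREATE OR REPLACE FUNCTION convert_udf (input FLOAT)
RETURNS FLOAT
SERVICE = CONVERT_API
ENDPOINT = 'convert-api'
AS '/convert';
```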

Let's build and test the image locally from the terminal. **Note:** It is a best practice to tag your local images with a `local_repository` prefix. Often, users will set this to a combination of first initial and last name, e.g. `jsmith/my-image:latest`. Navigate to your local clone of `.../sfguide-intro-to-snowpark-container-services/src/convert-api` and run a Docker build command:
```bash
cd .../sfguide-intro-to-snowpark-container-services/src/convert-api
docker build --platform=linux/amd64 -t <local_repository>/convert-api:latest .
```

Now that we have a local version of our container working, we need to push it to the image repository.
```bash
# Log in to the registry (as in the Jupyter service section), then tag the image
snow spcs image-registry login --connection CONTAINER_hol
> prompt for password
docker tag <local_repository>/convert-api:latest <repository_url>/convert-api:dev
```
**Note** the difference between `REPOSITORY_URL` (`org-account.registry.snowflakecomputing.com/container_hol_db/public/image_repo`) and `SNOWFLAKE_REGISTRY_HOSTNAME` (`org-account.registry.snowflakecomputing.com`)

Verify that the new tagged image exists by running:
```bash
docker image list
```

Duration: 5

There are a number of useful functions we should explore with respect to controlling the service itself from SQL. More information on SQL commands can be found at [Snowpark Container Services SQL Commands](https://docs.snowflake.com/en/sql-reference/commands-snowpark-container-services#service) and [Snowpark Container Services System Functions](https://docs.snowflake.com/en/developer-guide/snowpark-container-services/overview#what-s-next)

1. Get the status of your container using:<br>

From a SQL console:

```sql
SHOW SERVICE CONTAINERS IN SERVICE JUPYTER_SNOWPARK_SERVICE;
```

2. Check the status of the logs with:<br>

From a SQL console:

```sql
CALL SYSTEM$GET_SERVICE_LOGS('CONTAINER_HOL_DB.PUBLIC.JUPYTER_SNOWPARK_SERVICE', '0', 'jupyter-snowpark',10);
```

3. Suspend your container using the ALTER SERVICE command:<br>

From a SQL console:

```sql
ALTER SERVICE CONTAINER_HOL_DB.PUBLIC.JUPYTER_SNOWPARK_SERVICE SUSPEND;
```

4. Resume your container using the ALTER SERVICE command:<br>

From a SQL console:

```sql
ALTER SERVICE CONTAINER_HOL_DB.PUBLIC.JUPYTER_SNOWPARK_SERVICE RESUME;
```

<!-- ------------------------ -->
## Stop and Suspend
```bash
cp .env.example .env
sed -i -e "s/{INSERT A RANDOM STRING HERE}/$(openssl rand -base64 12)/" .env
sed -i -e "s/{INSERT ANOTHER RANDOM STRING HERE}/$(openssl rand -base64 12)/" .env
```
```bash
SNOWFLAKE_ACCOUNT={INSERT_ACCOUNT_NAME_HERE}
SNOWFLAKE_USERNAME={INSERT_USER_NAME_HERE}
```