Getting Started
This section walks you through installing the Command Line Interface (CLI) locally and using it for prompting.
Installation
On macOS or WSL, install the CLI with Homebrew if you haven't already:
brew tap smartloop-ai/smartloop
brew install smartloop
For Windows, install Python 3.11 from the official source or your package manager. Once Python 3.11 is installed, run the following command to install the CLI:
pip install -U smartloop
The Command Line Interface allows you to upload, fine-tune, and extract information from your documents in a few easy steps.
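On either platform, you can confirm that the CLI is installed and on your PATH by printing its built-in help (this assumes the conventional --help flag, which Python-based CLIs typically provide):
smartloop --help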
Configuration
To authenticate with the Smartloop Platform, run the following command in your terminal:
smartloop login
This will open a browser window to https://app.smartloop.ai/login where you can complete the authentication process. After login, you'll be redirected back to the CLI with an authorization code that will be exchanged for a token, which will be automatically saved for future use.
If you prefer to enter a token manually instead, use:
smartloop login --no-browser
Create a Project
A project is the top-level grouping where all of your content and model configuration lives:
smartloop projects create --name microsoft
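If you manage more than one project, you will need to select the one to work with before uploading documents; the subcommands available for listing and switching projects can be discovered from the CLI's built-in help (again assuming the conventional --help flag):
smartloop projects --help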
Upload Document
Once the project is selected, upload documents from a folder or a specific file to personalize your project. In this case, I am uploading a document describing Microsoft online services from my local machine:
smartloop upload --path=~/document1.pdf
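The step above mentions uploading from a folder as well as a single file; assuming the same --path flag also accepts a directory, the call would look like the following (the directory name is just a placeholder):
smartloop upload --path=~/documents/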
Prompting
You can start prompting by executing the following command:
smartloop run
This will start an interactive session that lets you query the model against your uploaded document(s).
Below is an example of the interactive interface:
Microsoft(microsoft-24-07-2024)
======================================
Enter prompt (Ctrl-C to exit):
what the SLA for azure open ai
⠋
The SLA (Service Level Agreement) for Azure OpenAI is not explicitly mentioned in the provided text. However, it's possible that the SLA for Azure OpenAI might be similar to the one mentioned below:
"Uptime Percentage"
* Service Credit:
+ < 99.9%: 10%
+ < 99%: 25%
+ < 95%: 100%
Please note that this is not a direct quote from the provided text, but rather an inference based on the format and structure of the SLA mentioned for other Azure services (e.g., SAP HANA on Azure High Availability Pair). To confirm the actual SLA for Azure OpenAI, you should check the official Microsoft documentation or contact their support team.
Prompt message (Ctrl-C to exit):