Advent of 2024, Day 21 – Microsoft Azure AI – Prompt flow in Azure AI Foundry
This article is originally published at https://tomaztsql.wordpress.com
In this Microsoft Azure AI series:
- Dec 01: Microsoft Azure AI – What is Foundry?
- Dec 02: Microsoft Azure AI – Working with Azure AI Foundry
- Dec 03: Microsoft Azure AI – Creating project in Azure AI Foundry
- Dec 04: Microsoft Azure AI – Deployment in Azure AI Foundry
- Dec 05: Microsoft Azure AI – Deployment parameters in Azure AI Foundry
- Dec 06: Microsoft Azure AI – AI Services in Azure AI Foundry
- Dec 07: Microsoft Azure AI – Speech service in AI Services
- Dec 08: Microsoft Azure AI – Speech Studio in Azure with AI Services
- Dec 09: Microsoft Azure AI – Speech SDK with Python
- Dec 10: Microsoft Azure AI – Language and Translation in Azure AI Foundry
- Dec 11: Microsoft Azure AI – Language and Translation Python SDK
- Dec 12: Microsoft Azure AI – Vision and Document AI Service
- Dec 13: Microsoft Azure AI – Vision and Document Python SDK
- Dec 14: Microsoft Azure AI – Content safety AI service
- Dec 15: Microsoft Azure AI – Content safety Python SDK
- Dec 16: Microsoft Azure AI – Fine-tuning a model
- Dec 17: Microsoft Azure AI – Azure OpenAI service
- Dec 18: Microsoft Azure AI – Azure AI Hub and Azure AI Project
- Dec 19: Microsoft Azure AI – Azure AI Foundry management center
- Dec 20: Microsoft Azure AI – Models and endpoints in Azure AI Foundry
Prompt flow in Azure AI Foundry is a development tool for designing flows (streamlined pipelines) that cover the complete end-to-end development cycle of an LLM-based AI application. You can create, iterate, test, orchestrate, debug, and monitor your flows.
For additional information on Prompt flow, go to: https://microsoft.github.io/promptflow/index.html
With Prompt flow you will be able to:
- collaborate with your team by using cloud version of Prompt flow (you can also clone the repository and use it on-prem)
- create flows that link LLMs, prompts and any other tools together for executable workflows
- use the power of the Python language with the promptflow package
- evaluate your flows, calculate quality and performance metrics
- integrate the testing and evaluation into your CI/CD systems
- deploy flows to your serving platform (as a prototype, an experiment, or a finalised solution)
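The core idea — nodes linked into an executable workflow — can be sketched in plain Python. This is a framework-free illustration of the concept, not the promptflow SDK itself; the node names and the stubbed model call are assumptions for demonstration only.

```python
# Conceptual sketch of a flow: three nodes chained into an executable
# workflow, each node's output feeding the next node's input.
# The LLM call is a stub; in a real flow it would hit a deployed model.

def build_prompt(question: str) -> str:
    """Prompt node: combine a system instruction with the user input."""
    system = "You are a concise assistant."
    return f"{system}\nUser: {question}"

def call_llm(prompt: str) -> str:
    """LLM node: stand-in for a deployed model call (e.g. Azure OpenAI)."""
    return f"(model answer to: {prompt.splitlines()[-1]})"

def postprocess(raw: str) -> str:
    """Python node: clean up the raw model output."""
    return raw.strip("() ")

def run_flow(question: str) -> str:
    """Execute the nodes in order — this ordering IS the flow."""
    return postprocess(call_llm(build_prompt(question)))

print(run_flow("What is Prompt flow?"))
```

In Prompt flow proper, the same chaining is declared on the canvas (or in a flow definition) rather than hard-coded, which is what makes the flow inspectable, testable, and deployable as an API.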
The main building blocks of Prompt flow are:
- A flow is an executable instruction set that implements the AI logic. Flows can be created or run via multiple tools, such as a prebuilt canvas or LangChain. Iterations of a flow can be saved as assets; once deployed, a flow becomes an API. Not all flows are prompt flows; rather, prompt flow is one way to create a flow.
- A prompt is a package of input sent to a model, consisting of the user input, the system message, and any examples. The user input is the text submitted in the chat window. The system message is a set of instructions that scopes the model's behaviors and functionality.
- A sample flow is a simple, prebuilt orchestration flow that shows how flows work, and can be customized.
- A sample prompt is a defined prompt for a specific scenario that can be copied from a library and used as-is or modified in prompt design.
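The "prompt as a package of input" described above can be sketched in the chat-message shape most chat models accept: a system message, optional few-shot examples, and the user input. The category names and example content here are illustrative assumptions, not taken from any sample prompt in the library.

```python
# Sketch of a prompt as a package of input: system message + few-shot
# example + user input, in the common chat-completion message format.

def build_messages(user_input: str) -> list:
    return [
        # system message: instructions scoping the model's behavior
        {"role": "system",
         "content": "Classify the text as News, App, or Academic."},
        # few-shot example pair (an "example" in the prompt package)
        {"role": "user", "content": "arxiv.org/abs/1706.03762"},
        {"role": "assistant", "content": "Academic"},
        # the actual user input, as submitted in the chat window
        {"role": "user", "content": user_input},
    ]

messages = build_messages("bbc.com/news")
```

The whole list — not just the last user message — is what gets sent to the model on each call.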
You can start building a flow from one of the three predefined types (standard flow, chat flow, or evaluation flow), or clone one of the prepared flows and deploy it based on your needs.
Web Classification
The Web Classification flow is an example available in the Prompt flow gallery that you can clone to experiment with, or deploy to your application. It uses an LLM to classify URL addresses into multiple categories.
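The logic of such a classification flow can be outlined as follows. The keyword-based stub below stands in for the real LLM call, and the category names are assumptions made for this sketch rather than the exact labels used by the gallery sample.

```python
# Illustrative sketch of a web-classification flow: build a prompt for
# a URL, obtain a category from the model, and validate the answer.
# The stub below replaces the actual LLM call to a deployed endpoint.

CATEGORIES = ["News", "App", "Academic", "None"]

def classify_url(url: str) -> str:
    prompt = f"Classify this URL into one of {CATEGORIES}: {url}"
    # --- stubbed LLM call: swap in a deployed model endpoint here ---
    lowered = url.lower()
    if "arxiv" in lowered or "doi.org" in lowered:
        answer = "Academic"
    elif "news" in lowered or "bbc" in lowered:
        answer = "News"
    elif "play.google" in lowered or "apps.apple" in lowered:
        answer = "App"
    else:
        answer = "None"
    # guard against an out-of-vocabulary model answer
    return answer if answer in CATEGORIES else "None"

print(classify_url("https://arxiv.org/abs/1706.03762"))  # Academic
```

In the real gallery flow, the classification decision comes from the LLM node and the surrounding nodes handle fetching the page content and parsing the model's answer.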
The Prompt flow canvas also shows a graphical presentation of the flow.
Tomorrow we will look into building prompt flow in VS Code using Python.
All of the code samples will be available on my GitHub.