Advent of 2024, Day 3 – Microsoft Azure AI – Creating project in Azure AI Foundry
This article is originally published at https://tomaztsql.wordpress.com
In this Microsoft Azure AI series:
- Dec 01: Microsoft Azure AI – What is Foundry?
- Dec 02: Microsoft Azure AI – Working with Azure AI Foundry
In Azure AI Foundry you will be able to create a project that keeps your solution together.
Select “+ Create project” and give the hub a name.
Once the project is created, you will get the endpoints:
- Azure AI inference
- Azure OpenAI
- Azure AI Services
These APIs will be available for you to call the services when building your solution.
Azure AI inference API
The Azure AI models inference service provides access to the most powerful models available in the Azure AI model catalog. These models come from key providers in the industry, including OpenAI, Microsoft, Meta, Mistral, Cohere, G42, and AI21 Labs, and can be integrated with software solutions to deliver a wide range of tasks, including content generation, summarization, image understanding, semantic search, and code generation.
The Azure AI model inference service provides a way to consume models as APIs without hosting them on your infrastructure. Models are hosted in a Microsoft-managed infrastructure, which enables API-based access to the model provider’s model. API-based access can dramatically reduce the cost of accessing a model and simplify the provisioning experience.
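As a rough sketch, the inference endpoint can be called as a plain REST API. The endpoint URL, API key, `api-version` value, and model name below are placeholders, not values from this article; substitute the ones shown on your project's overview page.

```python
"""Hedged sketch: calling the Azure AI model inference endpoint over REST.
Endpoint, key, api-version, and model name are placeholder assumptions."""
import json
import urllib.request


def build_chat_payload(model: str, question: str) -> dict:
    """Assemble a chat-completions request body for the inference API."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": question},
        ],
    }


def call_inference(endpoint: str, api_key: str, payload: dict) -> dict:
    """POST the payload to the models/chat/completions route."""
    url = f"{endpoint}/models/chat/completions?api-version=2024-05-01-preview"
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json", "api-key": api_key},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


if __name__ == "__main__":
    body = build_chat_payload("Mistral-large", "What is Azure AI Foundry?")
    # Needs a deployed model and valid credentials to actually run:
    # result = call_inference("https://<resource>.services.ai.azure.com", "<key>", body)
    # print(result["choices"][0]["message"]["content"])
    print(json.dumps(body, indent=2))
```

Because the same route serves every model in the catalog, switching providers is just a change of the `model` field, not a new client.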
Azure OpenAI
Managing and interacting with Azure OpenAI models and resources is divided across three primary API surfaces:
- Control plane (Azure OpenAI shares a common control plane with all other Azure AI Services. The control plane API is used for tasks like creating Azure OpenAI resources, deploying models, and other higher-level resource management. The control plane also governs what is possible with capabilities like Azure Resource Manager, Bicep, Terraform, and the Azure CLI.)
- Data plane – authoring (The data plane authoring API controls fine-tuning, file upload, ingestion jobs, and batch and model-level queries.)
- Data plane – inference (The data plane inference API provides the inference capabilities/endpoints for features like completions, chat completions, embeddings, speech/Whisper, on your data, DALL-E, assistants, etc.)
Each API surface/specification encapsulates a different set of Azure OpenAI capabilities. Each API has its own unique set of preview and stable/generally available (GA) API releases. Preview releases currently tend to follow a monthly cadence.
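To make the data-plane inference surface concrete, here is a minimal sketch of a chat-completions call against an Azure OpenAI deployment. The resource name, deployment name, key, and `api-version` are placeholder assumptions; the route shape (`/openai/deployments/{deployment}/chat/completions`) is the documented data-plane pattern.

```python
"""Hedged sketch of a data-plane inference call to an Azure OpenAI deployment.
Resource, deployment, key, and api-version below are placeholders."""
import json
import urllib.request


def chat_completions_url(endpoint: str, deployment: str, api_version: str) -> str:
    """Build the data-plane inference route for chat completions."""
    return (f"{endpoint}/openai/deployments/{deployment}"
            f"/chat/completions?api-version={api_version}")


def call_chat(endpoint, deployment, api_key, messages, api_version="2024-06-01"):
    """POST the messages to the deployment and return the parsed response."""
    req = urllib.request.Request(
        chat_completions_url(endpoint, deployment, api_version),
        data=json.dumps({"messages": messages}).encode("utf-8"),
        headers={"Content-Type": "application/json", "api-key": api_key},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Requires valid credentials to actually call; here we just show the URL:
    url = chat_completions_url(
        "https://<resource>.openai.azure.com", "gpt-4o-mini", "2024-06-01")
    print(url)
```

Note that the model is addressed by your deployment name, not the underlying model name, and that the `api-version` query parameter selects between the preview and GA releases mentioned above.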
Azure AI Services
Managing and interacting with Azure services like Document Intelligence, Computer Vision and Custom Vision, the Face API, Speech, Azure AI Search, and many others.
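These services are also callable through the project's multi-service endpoint. As one hedged example, here is a sketch of the Language service's sentiment analysis REST call; the endpoint, key, and `api-version` are placeholder assumptions.

```python
"""Hedged sketch: calling one Azure AI Service (Language sentiment analysis)
through the multi-service endpoint. Endpoint, key, api-version are placeholders."""
import json
import urllib.request


def build_sentiment_request(texts: list) -> dict:
    """Wrap plain strings into the analyze-text request shape."""
    return {
        "kind": "SentimentAnalysis",
        "analysisInput": {
            "documents": [
                {"id": str(i), "language": "en", "text": t}
                for i, t in enumerate(texts, start=1)
            ]
        },
    }


def analyze_sentiment(endpoint, api_key, texts):
    """POST documents to the Language analyze-text route."""
    url = f"{endpoint}/language/:analyze-text?api-version=2023-04-01"
    req = urllib.request.Request(
        url,
        data=json.dumps(build_sentiment_request(texts)).encode("utf-8"),
        headers={"Content-Type": "application/json",
                 "Ocp-Apim-Subscription-Key": api_key},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Without credentials, just show the request body we would send:
    print(json.dumps(build_sentiment_request(["Azure AI Foundry is great."]),
                     indent=2))
```

The same `Ocp-Apim-Subscription-Key` header and multi-service endpoint pattern applies to the other Azure AI Services, each with its own route and request shape.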
All Endpoints
If you want to explore all the endpoints, you will need to deploy the service and connect it through the project.
Tomorrow we will start with deployment using Azure AI Foundry and check the details.
All of the code samples will be available on my GitHub.