Amazon Bedrock is a fully managed service that offers a choice of high-performing foundation models (FMs) from leading AI companies like AI21 Labs, Anthropic, Cohere, Meta, Mistral AI, Stability AI, and Amazon through a single API, along with a broad set of capabilities you need to build generative AI applications with security, privacy, and responsible AI. Using Amazon Bedrock, you can easily experiment with and evaluate top FMs for your use case, privately customize them with your data using techniques such as fine-tuning and Retrieval Augmented Generation (RAG), and build agents that execute tasks using your enterprise systems and data sources. Since Amazon Bedrock is serverless, you don't have to manage any infrastructure, and you can securely integrate and deploy generative AI capabilities into your applications using the AWS services you are already familiar with.
This library wraps the InvokeModel operation of the official .NET SDK for Amazon Bedrock Runtime and exposes it as ODC actions, one per supported model family, for easy use.
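Under the hood, each invoke action builds a model-specific JSON body and passes it to InvokeModel. The following is a minimal sketch of the wrapped SDK call using the AWSSDK.BedrockRuntime package directly rather than the ODC action surface itself; the credential placeholders, region, and Amazon Titan Text payload are illustrative only.

```csharp
using System;
using System.IO;
using System.Text.Json;
using System.Threading.Tasks;
using Amazon;
using Amazon.BedrockRuntime;
using Amazon.BedrockRuntime.Model;
using Amazon.Runtime;

class InvokeModelSketch
{
    static async Task Main()
    {
        // The 'credentials' and 'region' inputs of the ODC actions correspond to these SDK arguments.
        var client = new AmazonBedrockRuntimeClient(
            new BasicAWSCredentials("ACCESS_KEY_ID", "SECRET_ACCESS_KEY"),
            RegionEndpoint.GetBySystemName("us-east-1"));

        // The 'request' input: a model-specific JSON body, here for Amazon Titan Text Express.
        var body = JsonSerializer.SerializeToUtf8Bytes(new
        {
            inputText = "Summarize Amazon Bedrock in one sentence.",
            textGenerationConfig = new { maxTokenCount = 256, temperature = 0.5, topP = 0.9 }
        });

        var response = await client.InvokeModelAsync(new InvokeModelRequest
        {
            ModelId = "amazon.titan-text-express-v1",   // the 'modelId' input
            ContentType = "application/json",
            Accept = "application/json",
            Body = new MemoryStream(body)
        });

        // The 'response' output: the raw JSON returned by the model.
        using var reader = new StreamReader(response.Body);
        Console.WriteLine(await reader.ReadToEndAsync());
    }
}
```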
The library exposes the following actions:

- Invoke an Anthropic Claude 3 Text Model (request and response payloads are sketched after this list)
  - Input parameters: credentials, region, modelId, request
  - Result: response
- Invoke an Amazon Titan Text Model
- Invoke a Mistral AI Instruct Model
- Invoke a Cohere Command Model
- Invoke a Meta Llama Model
- Convert text to embeddings using the Amazon Titan Embeddings model
- Convert text to embeddings using Cohere Embed
- Create images with Stability AI Stable Diffusion XL (SDXL) 1.0
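The shape of the request and response payloads differs per model family. As an illustration only (the attribute structure of the ODC request and response records is an assumption here, mirroring the JSON schemas Bedrock documents for each model), an Anthropic Claude 3 text invocation uses the Messages schema, while Titan Embeddings takes a single input string.

```csharp
using System;
using System.Text.Json;

// Illustrative payloads only; the ODC 'request' structures are assumed to mirror
// the JSON schemas documented by Amazon Bedrock for each model family.

// Anthropic Claude 3 text models use the Messages schema on Bedrock.
var claudeBody = new
{
    anthropic_version = "bedrock-2023-05-31",
    max_tokens = 512,
    system = "You are a helpful technical support assistant.",
    messages = new[]
    {
        new { role = "user", content = "How do I reset my password?" }
    }
};
// Typical Claude response fields: content[0].text, stop_reason, usage.

// Amazon Titan Embeddings (e.g. amazon.titan-embed-text-v1) takes a single input string.
var titanEmbedBody = new { inputText = "How do I reset my password?" };
// Typical response fields: embedding (array of floats), inputTextTokenCount.

Console.WriteLine(JsonSerializer.Serialize(claudeBody));
Console.WriteLine(JsonSerializer.Serialize(titanEmbedBody));
```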
You can use the Amazon Bedrock Converse API to create conversational applications that send and receive messages to and from an Amazon Bedrock model. For example, you can create a chatbot that maintains a conversation over many turns and uses a persona or tone that is unique to your needs, such as a helpful technical support assistant.
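Below is a rough sketch of a single Converse turn against the underlying .NET SDK, assuming a recent AWSSDK.BedrockRuntime version that includes the Converse operation; the model ID, system prompt, and user message are placeholders, and this is not necessarily the exact ODC action surface.

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Amazon;
using Amazon.BedrockRuntime;
using Amazon.BedrockRuntime.Model;

class ConverseSketch
{
    static async Task Main()
    {
        // Uses the default credential chain; region is a placeholder.
        var client = new AmazonBedrockRuntimeClient(RegionEndpoint.USEast1);

        // Conversation history is carried across turns in the Messages list.
        var messages = new List<Message>
        {
            new Message
            {
                Role = ConversationRole.User,
                Content = new List<ContentBlock> { new ContentBlock { Text = "My laptop won't turn on." } }
            }
        };

        var response = await client.ConverseAsync(new ConverseRequest
        {
            ModelId = "anthropic.claude-3-haiku-20240307-v1:0",
            Messages = messages,
            System = new List<SystemContentBlock>
            {
                new SystemContentBlock { Text = "You are a helpful technical support assistant." }
            },
            InferenceConfig = new InferenceConfiguration { MaxTokens = 512, Temperature = 0.5F }
        });

        // Append the assistant reply to the history before sending the next user turn.
        var reply = response.Output.Message;
        messages.Add(reply);
        Console.WriteLine(reply.Content[0].Text);
    }
}
```

Because the full message list is resent on every call, the caller owns the conversation state, which is what lets a chatbot maintain context over many turns.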
Source code of the connector library: stefan-d-p/odc-awsbedrock-library on GitHub (OutSystems Developer Cloud External Logic library for the Amazon Bedrock Runtime SDK).