Knowledge Bases for Amazon Bedrock now supports additional data connectors (in preview)

Enhance company knowledge bases with new Amazon Bedrock connectors for Confluence, Salesforce, SharePoint, and web domains, giving retrieval-augmented generation (RAG) applications the contextual data they need for more accurate and relevant responses.
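As a rough illustration, a new web-domain data source might be attached to an existing knowledge base with boto3's `bedrock-agent` client. This is a minimal sketch, assuming a placeholder knowledge base ID and an assumed shape for the web-crawler configuration; check the current API reference for the exact field names.

```python
# Hedged sketch: adding a web-domain connector (preview) to an existing
# knowledge base. The knowledge base ID, seed URL, and the exact
# dataSourceConfiguration fields are assumptions for illustration.
import boto3

bedrock_agent = boto3.client("bedrock-agent", region_name="us-east-1")

response = bedrock_agent.create_data_source(
    knowledgeBaseId="KB1234567890",          # placeholder knowledge base ID
    name="docs-site-crawler",
    dataSourceConfiguration={
        "type": "WEB",                        # assumed connector type name
        "webConfiguration": {
            "sourceConfiguration": {
                "urlConfiguration": {
                    "seedUrls": [{"url": "https://example.com/docs"}]
                }
            }
        },
    },
)
print(response["dataSource"]["dataSourceId"])
```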

Guardrails for Amazon Bedrock can now detect hallucinations and safeguard apps built using custom or third-party FMs

Guardrails for Amazon Bedrock adds hallucination detection and an independent evaluation API, letting you apply customized safeguards to generative AI applications built on any model, including custom and third-party FMs, for more responsible and trustworthy outputs.
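Because the API is independent of model invocation, output from any model can be screened after the fact. The sketch below assumes a placeholder guardrail ID and version and uses boto3's `bedrock-runtime` client; the response field names shown are assumptions to verify against the API reference.

```python
# Hedged sketch: screening a third-party model's response with the
# standalone ApplyGuardrail call. Guardrail ID/version are placeholders.
import boto3

runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

result = runtime.apply_guardrail(
    guardrailIdentifier="gr-example123",   # placeholder guardrail ID
    guardrailVersion="1",
    source="OUTPUT",                        # evaluate a model response
    content=[{"text": {"text": "The capital of Australia is Sydney."}}],
)

if result.get("action") == "GUARDRAIL_INTERVENED":
    print("Blocked or rewritten:", result.get("outputs"))
else:
    print("Response passed the guardrail checks.")
```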

Agents for Amazon Bedrock now support memory retention and code interpretation (preview)

Agents for Amazon Bedrock now offer Memory to retain user context and Code Interpreter to dynamically execute code snippets—whether for data analysis, visualization, or complex problem-solving.
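A hedged sketch of how memory retention might look when invoking an agent with boto3's `bedrock-agent-runtime` client follows. The agent and alias IDs are placeholders, and the `memoryId` parameter is an assumption based on the preview announcement; confirm the parameter name in the current API reference.

```python
# Hedged sketch: invoking an agent with a memory identifier so user context
# persists across sessions. IDs below are placeholders.
import boto3

agent_runtime = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

response = agent_runtime.invoke_agent(
    agentId="AGENT123456",          # placeholder agent ID
    agentAliasId="ALIAS123456",     # placeholder alias ID
    sessionId="session-001",
    memoryId="user-42",             # assumed parameter for cross-session memory
    inputText="Plot last quarter's sales by region.",
)

# The completion is returned as an event stream of chunks.
for event in response["completion"]:
    chunk = event.get("chunk")
    if chunk:
        print(chunk["bytes"].decode("utf-8"), end="")
```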

Customize Amazon Q Developer (in your IDE) with your private code base

Customize Amazon Q Developer on your private code base to get code suggestions tailored to your internal libraries and APIs, boosting productivity while maintaining robust data privacy and security standards.

Amazon Q Apps, now generally available, enables users to build their own generative AI apps

Build generative AI apps from conversations using natural language and approved data sources. Securely shared apps can be customized, data sources can be specified per card, and new APIs support programmatic app management.

Build enterprise-grade applications with natural language using AWS App Studio (preview)

Effortlessly build apps with AI-powered low-code tools, enabling organizations to create secure custom apps in minutes without dedicated development teams, streamlining processes like claims, inventory, and approvals.

Vector search for Amazon MemoryDB is now generally available

Store, index, retrieve, and search vectors with in-memory performance and single-digit-millisecond query latency, supporting use cases such as retrieval-augmented generation, semantic caching, and anomaly detection.
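Since MemoryDB is Redis-compatible, vector search is exposed through the familiar FT.* search commands. The following is a minimal sketch using redis-py, assuming a hypothetical cluster endpoint, index name, and 384-dimensional embeddings; the schema and field names are illustrative, not the service's required layout.

```python
# Hedged sketch: HNSW index creation and a KNN query against MemoryDB
# using redis-py. Endpoint, index, and field names are assumptions.
import numpy as np
import redis
from redis.commands.search.field import TextField, VectorField
from redis.commands.search.query import Query

r = redis.Redis(host="my-memorydb-cluster.example.amazonaws.com",
                port=6379, ssl=True)

# Create an HNSW index over 384-dimensional float32 embeddings (run once).
r.ft("doc_idx").create_index([
    TextField("content"),
    VectorField("embedding", "HNSW", {
        "TYPE": "FLOAT32", "DIM": 384, "DISTANCE_METRIC": "COSINE",
    }),
])

# Store one document as a hash with a float32 byte-string embedding.
r.hset("doc:1", mapping={
    "content": "MemoryDB now supports vector search.",
    "embedding": np.random.rand(384).astype(np.float32).tobytes(),
})

# Retrieve the 5 nearest neighbours to a (placeholder) query embedding.
query_vec = np.random.rand(384).astype(np.float32)
q = (
    Query("*=>[KNN 5 @embedding $vec AS score]")
    .sort_by("score")
    .return_fields("content", "score")
    .dialect(2)
)
results = r.ft("doc_idx").search(q, query_params={"vec": query_vec.tobytes()})
for doc in results.docs:
    print(doc.content, doc.score)
```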

Announcing Llama 3.1 405B, 70B, and 8B models from Meta in Amazon Bedrock

The Llama 3.1 collection comprises multilingual models at 8B, 70B, and 405B parameters that demonstrate state-of-the-art performance on a wide range of industry benchmarks, offering new capabilities for your generative AI applications.
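Once enabled in your account, the models can be called like any other Bedrock model. Below is a minimal sketch using the Converse API in boto3; the model ID string is an assumption, so substitute the identifier listed for your region in the Bedrock console.

```python
# Hedged sketch: a single-turn request to a Llama 3.1 model via the
# Bedrock Converse API. The modelId value is an assumed identifier.
import boto3

runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

response = runtime.converse(
    modelId="meta.llama3-1-8b-instruct-v1:0",   # assumed Llama 3.1 8B model ID
    messages=[
        {"role": "user",
         "content": [{"text": "Summarize retrieval-augmented generation in two sentences."}]}
    ],
    inferenceConfig={"maxTokens": 256, "temperature": 0.2},
)
print(response["output"]["message"]["content"][0]["text"])
```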

Looking back on FY24: from Copilots empowering human achievement to leading AI Transformation

A year ago in July, Microsoft coined the term AI Transformation. It is almost hard to imagine that, at the time, Copilots were not generally available and Azure OpenAI Service had been available for only six months. As Satya Nadella, our Chairman and CEO, stated in last quarter’s earnings: 60% of the Fortune 500 have adopted…

Prioritizing security above all else

Microsoft runs on trust, and our success depends on earning and maintaining it. We have a unique opportunity and responsibility to build the most secure and trusted platform that the world innovates upon.
