Introducing Amazon Nova Premier, our most capable model and the best teacher model for Amazon Bedrock Model Distillation. It enables customers to create highly capable, cost-effective, low-latency custom distilled models for their specific needs. Amazon Nova Premier excels at complex tasks such as retrieval-augmented generation (RAG), function calling, and agentic coding. With a one-million-token context window, it can analyze large inputs, including sizable codebases, documents of 400+ pages, and 90-minute videos. Amazon Nova Premier is also the most cost-effective proprietary model in its intelligence tier on Amazon Bedrock today. And this is just the beginning: stay tuned as we continue advancing the Nova family.
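For readers who want to try the model programmatically, here is a minimal sketch of calling Nova Premier through the Amazon Bedrock Converse API with boto3. The model ID and region shown are assumptions; confirm the exact identifier (and whether a cross-region inference profile is required) in the Bedrock model catalog for your account.

```python
# Minimal sketch: invoking Amazon Nova Premier via the Amazon Bedrock Converse API.
# The model ID and region below are assumptions; verify them in the Bedrock console.
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.converse(
    modelId="us.amazon.nova-premier-v1:0",  # assumed identifier for Nova Premier
    messages=[
        {"role": "user", "content": [{"text": "Summarize the key ideas in this design document: ..."}]}
    ],
    inferenceConfig={"maxTokens": 512, "temperature": 0.3},
)

# The Converse API returns the assistant message under output.message.content
print(response["output"]["message"]["content"][0]["text"])
```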
Amazon Science
Research Services
Seattle, Washington · 373,895 followers
The latest news and research from Amazon’s science community. #AmazonScience
About us
Amazon Science gives you insight into the company’s approach to customer-obsessed scientific innovation. Amazon fundamentally believes that scientific innovation is essential to being the most customer-centric company in the world. It’s the company’s ability to have an impact at scale that allows us to attract some of the brightest minds in artificial intelligence and related fields. Our scientists continue to publish, teach, and engage with the academic community, in addition to applying our working-backwards method to enrich the way we live and work. Follow us on LinkedIn and visit our website for a deep dive into innovation at Amazon, and explore the many ways you can engage with our scientific community. #AmazonScience
- Website: https://www.amazon.science
- Industry: Research Services
- Company size: 10,001+ employees
- Headquarters: Seattle, Washington
- Founded: 2020
- Specialties: Artificial Intelligence, Machine Learning, Computer Vision, Cloud, Economics, Sustainability, AI, ML, Conversational AI, Natural Language Processing, NLP, Robotics, Security, Privacy, Information, Knowledge Management, Operations, Scientific Research, Search, Amazon, and Alexa
Updates
- Reinforcement learning can align language models to human preferences but risks learning irrelevant correlations, like favoring longer responses. At ICLR, Amazon researchers presented a way to address this by generating contrasting pairs of training examples and keeping only the ones with very high contrast (see the sketch after this list for the general idea): https://amzn.to/436EJui
- We've made it even easier to explore Amazon Nova foundation models. Starting today, nova.amazon.com no longer requires a login. Experience our understanding and creative generation capabilities right away.
- This week, we're at ICLR sharing the latest research in deep learning from Amazon scientists. Our team is showcasing work spanning LLMs, knowledge distillation, and graph neural networks. Explore our accepted papers: https://amzn.to/3Ru5Eur #ICLR2025
- New on nova.amazon.com: Amazon Nova Sonic, our speech-to-speech foundation model. Get started using your voice to engage in real-time conversations.
- The deadline is approaching for the Amazon Research Awards spring 2025 call for proposals. Apply by April 30: https://amzn.to/3YBCKfM
- Amazon Q Developer in SageMaker Canvas is a new generative-AI-powered assistant that lets customers build and deploy ML models in minutes using only natural language, with no ML expertise required: https://amzn.to/43HELdT
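As a rough illustration of the high-contrast filtering idea mentioned in the ICLR item above (not the paper's actual method or code), the sketch below keeps only preference pairs whose reward scores differ by a large margin. The reward_fn, the margin value, and the pair format are hypothetical placeholders for the example.

```python
# Illustrative sketch only: keep preference pairs with a large reward gap
# (high contrast) and discard near-ties. reward_fn, margin, and the pair
# layout are hypothetical placeholders, not the published method.
from typing import Callable, Iterable

Pair = tuple[str, str, str]  # (prompt, response_a, response_b)

def keep_high_contrast_pairs(
    pairs: Iterable[Pair],
    reward_fn: Callable[[str, str], float],
    margin: float = 2.0,
) -> list[Pair]:
    """Return (prompt, preferred, rejected) triples whose reward gap is >= margin."""
    kept = []
    for prompt, resp_a, resp_b in pairs:
        score_a = reward_fn(prompt, resp_a)
        score_b = reward_fn(prompt, resp_b)
        if abs(score_a - score_b) >= margin:
            preferred, rejected = (resp_a, resp_b) if score_a > score_b else (resp_b, resp_a)
            kept.append((prompt, preferred, rejected))
    return kept
```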