Announcing Marqtune, the Embedding Model Training Platform

The Embeddings Cloud

Train, generate and retrieve vector embeddings in one platform.
Build powerful AI applications and transform your retrieval stack with Marqo Cloud.

How are you managing embeddings?

Model Training
Embedding models can perform worse than traditional BM25 retrieval without fine-tuning on domain-specific data.
Model Serving
Building scalable and efficient pipelines to generate embeddings for ingestion and search requires dedicated infrastructure.
Retrieval
Embedding retrieval needs to be augmented with other techniques, such as reranking, hybrid search and filtering.
Evaluation
Comprehensive and consistent evaluation of retrieval performance is needed to increase iteration speed and improve results in production.

The Marqo Stack

Train

Train hundreds of open-source embedding models, bring your own, or train with Marqo's Generalized Contrastive Learning.

Embed

Marqo’s community-backed embedding inference engine provides blazingly fast image ingestion, inference and search. Marqo supports hundreds of embedding models out of the box, as well as custom weights.

Retrieve

Marqo provides horizontally scalable vector and lexical retrieval with metadata handling, reranking, filtering, score modification and hybrid search.

Evaluate

Marqo evaluates your search system with grades for precision, ranking, consistency, and average relevance, using multimodal LLMs to analyze the relevance of each returned result against a standardized set of queries.

Marqo research and proprietary models

Build transformative search and information retrieval experiences powered by industry-leading models and tools.

Generalized Contrastive Learning

GCL improves search result relevance, delivering a 94% increase in NDCG@10 and a 504% increase in ERR@10 on in-domain data.
Learn More
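For readers unfamiliar with the metrics cited above, the sketch below shows how NDCG@k and ERR@k are conventionally computed from graded relevance labels. This is a generic reference implementation, not Marqo's evaluation code; the relevance labels and maximum grade are illustrative.

```python
import math

def dcg_at_k(rels, k):
    """Discounted cumulative gain over the top-k graded relevance labels."""
    return sum((2 ** rel - 1) / math.log2(i + 2) for i, rel in enumerate(rels[:k]))

def ndcg_at_k(rels, k):
    """DCG normalized by the DCG of the ideal (descending) ordering."""
    ideal = dcg_at_k(sorted(rels, reverse=True), k)
    return dcg_at_k(rels, k) / ideal if ideal > 0 else 0.0

def err_at_k(rels, k, g_max=3):
    """Expected reciprocal rank under the cascade user model."""
    err, p_continue = 0.0, 1.0
    for i, rel in enumerate(rels[:k]):
        r = (2 ** rel - 1) / (2 ** g_max)  # chance this result satisfies the user
        err += p_continue * r / (i + 1)
        p_continue *= 1 - r
    return err

# Graded relevance of the top 5 results for one query (3 = perfect, 0 = irrelevant)
ranking = [3, 2, 3, 0, 1]
print(round(ndcg_at_k(ranking, 10), 4))  # 0.9575
print(round(err_at_k(ranking, 10), 4))   # 0.9215
```

Both metrics reward placing highly relevant results early: swapping the second and third results above would raise NDCG@10 toward 1.0, which is why percentage gains on these metrics translate directly into better-ordered result pages.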

Understanding Recall in HNSW Search

Dive into Marqo's benchmarks to learn more about HNSW recall and how it is affected by intrinsic dimensionality.
Learn More

How you can use Marqo

Search

Use advanced semantic search to improve relevance and optimize results for business outcomes.
Learn More

RAG & Generative AI

Improve the performance of generative AI applications with state-of-the-art embeddings, fine tuning, and multi-modal capabilities.
Learn More

Recommendations & personalization

Recommend the best products for your customers based on their search, browse, and other activities and characteristics, to drive more purchases.
Learn More

Content moderation

Find similar entities, whether text ("red shoe" and "red shoe.") or images, in order to clean up product catalogs, social feeds, and more. Eliminate NSFW content through image-based semantic search.
Learn More

Trusted by

Marqo is instrumental
Vector search has become a must have component of generative AI. Marqo is instrumental in making AI useful to businesses by enabling developers to use the best technology with little effort.
Aidan Gomez
CEO and co-founder of Cohere
See results instantly
With Marqo, we were able to deploy advanced vector search quickly and easily and see results instantly. We went from sign-up to production A/B testing in five days and, within the next week, had rolled out a new feature to 100% of our traffic after we saw an improvement in key metrics.
Anthony Ziebell
Head of AI at Temple & Webster

Why Marqo

End-to-end

Marqo has you covered from training to inference to storage. With Marqo you don't need to calculate the vectors yourself: simply select the model you want to use and pass text and/or image URLs directly to the API.
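A minimal sketch of that workflow using Marqo's open-source Python client (the index name, model choice, and document fields are illustrative, and a Marqo instance is assumed to be running at localhost:8882):

```python
import marqo

mq = marqo.Client(url="http://localhost:8882")

# Choose a model at index creation; Marqo computes all vectors server-side
mq.create_index("my-products", model="open_clip/ViT-B-32/laion2b_s34b_b79k")

# Pass text and image URLs directly; no client-side embedding step
mq.index("my-products").add_documents(
    [{"title": "Red running shoe",
      "image": "https://example.com/red-shoe.jpg"}],  # hypothetical URL
    tensor_fields=["title", "image"],
)

results = mq.index("my-products").search("bright red sneaker")
```

The search call returns ranked hits with scores and highlights, so the same API surface covers ingestion and retrieval.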

Multimodal

Marqo helps you configure multimodal models like CLIP to pull semantic meaning from images and other data types. You can seamlessly search any combination of text and images, and even combine text and images into a single vector.

Multilingual

Search in over 100 languages. Marqo provides access to state-of-the-art multilingual models. Expand your search to new locales with zero manual language configuration changes.

Scalable

Run Marqo in a Docker container on your laptop or scale it up to dozens of GPU inference nodes in the cloud. Marqo can be scaled to provide low-latency searches against multi-terabyte indexes.
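For local experimentation, a single-node instance can be started following Marqo's documented Docker quickstart (image tag and port shown are the project's defaults):

```shell
docker pull marqoai/marqo:latest
docker rm -f marqo
docker run --name marqo -it -p 8882:8882 marqoai/marqo:latest
```

Once the container is up, the API is available at http://localhost:8882; production deployments scale the same API horizontally across inference and storage nodes.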

Request a demo

We’d love to speak with you. Send us your questions about Marqo and we’ll set up a time to meet with you.

Book Demo