Pre-trained transformer models have significantly improved search and yield much better results than traditional approaches. With their strong text-understanding capabilities, they can retrieve relevant results without relying on external ranking signals like PageRank or click-through rates. This makes these methods especially suitable for applications where such information is missing.
In this talk, I will give an introduction to semantic search:
- How to create the necessary representations
- How to find relevant documents
- How to optimize and adapt these methods to your own use case without the need for labeled data
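The core idea behind the first two points can be sketched in a few lines: documents and queries are mapped to vectors, and relevance is scored by cosine similarity. The toy bag-of-words encoder below is a hypothetical stand-in for a real transformer embedding model (e.g. one loaded via the sentence-transformers library mentioned in the speaker bio); only the retrieval mechanics are illustrated.

```python
import math
from collections import Counter

def embed(text):
    # Toy stand-in for a transformer encoder: a bag-of-words count vector.
    # A real semantic search system would use dense transformer embeddings.
    return Counter(text.lower().split())

def cos_sim(a, b):
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# A tiny document collection (illustrative example data).
docs = [
    "pre-trained transformers improve semantic search",
    "classic ranking relies on pagerank and click signals",
    "transformers understand text without external signals",
]
doc_vecs = [embed(d) for d in docs]

def search(query, k=2):
    # Rank all documents by similarity to the query and return the top k.
    q = embed(query)
    scored = sorted(((cos_sim(q, v), d) for v, d in zip(doc_vecs, docs)),
                    reverse=True)
    return [d for _, d in scored[:k]]

print(search("semantic search with transformers"))
```

With transformer embeddings in place of the count vectors, the same ranking step retrieves documents by meaning rather than by word overlap.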
Lucky draw prizes
We will raffle the book to 5 winners during the event. To enter the lucky draw, please complete one of the two steps below:
- Tweet about the event with the hashtag #nlpaicamp and tag @aicampai. For example:
#nlpaicamp online AI/ML tech talk series by @aicampai: @Nils_Reimers (Principal Scientist at @CohereAI) will discuss everything about semantic search with pre-trained transformers. Free RSVP: https://www.aicamp.ai
- Comment on or reshare the post on LinkedIn: LinkedIn Post
* Print copy for winners in the US; e-book for winners outside the US.
Nils Reimers is an expert on search relevance using pre-trained transformer networks. In 2018, he authored and open-sourced the popular sentence-transformers library, which, with more than 20 million downloads, is the most popular framework for building semantic search applications. He then joined HuggingFace to build the research team on neural search, with a special focus on adapting these methods to use cases without labeled data. Recently, he joined cohere.ai as director of machine learning to develop new state-of-the-art neural search models and to make them broadly accessible as API endpoints.