Member of Technical Staff - Search
Posted on Sunday, October 15, 2023
Who are we?
Cohere is focused on building and deploying large language model (LLM) AI into enterprises in a safe and responsible way — one that drives human productivity, creates real business value, and opens magical new ways to interact with technology. We’re a team of highly motivated and experienced engineers, innovators, and disruptors looking to change the face of technology.
Our goals are ambitious, but also concrete and practical. Cohere wants to fundamentally change how businesses operate, making everyone more productive and free to focus on what they do best. Every day, our team breaks new ground as we build transformational AI technology and products that let enterprises and developers harness the power of LLMs.
Cohere was founded by three global leaders in AI development, including our CEO, Aidan Gomez, who co-created the Transformer, which makes LLMs possible. Collectively, we're driven by the belief that our technology has the potential to revolutionize the way enterprises, their employees, and customers engage with technology through language.
Cohere’s broader research team is world-renowned, having contributed to the development of sentence transformers for semantic search, dynamic adversarial data collection and red teaming, and retrieval augmented generation, often referred to as “RAG,” among other technological breakthroughs.
We have been deliberate in assembling a team of operational leaders with industry-leading experience, with backgrounds working at the most sophisticated, demanding, and respected enterprises in the world. Cohere’s operational leaders have built, scaled, and led multi-billion-dollar product lines and businesses at Google, Apple, Rakuten, YouTube, AWS, and Cisco.
The Cohere team is a collective from all walks of life, from people who left college to start businesses, to some of the most experienced people from globally renowned companies. We believe a diverse team is the key to safer, more responsible technology, and that different experiences and backgrounds enable us to tackle problems from all angles and avoid blind spots.
There’s no better time to play a role in defining the future of AI, and its impact on the world.
Why this role?
We are looking for talented individuals to help us develop state-of-the-art models for information retrieval as part of our Search team. This group works on a range of tasks, including generating useful dense vectors and reranking. You'll have the opportunity to revolutionize people's search experience by helping build an efficient and precise search system, with plenty of room to experiment, innovate, and productionize your ideas.
Your work will specifically focus on advancing semantic search techniques to improve accuracy and efficiency, involving working with a wide range of technologies and collaborating with other teams to integrate your work into our search infrastructure.
We're looking for someone who is passionate about search and has a strong background in information retrieval. Candidates should have experience working with a wide range of technologies and have worked collaboratively with other teams in the past.
Please note: we have offices in Toronto, Palo Alto, San Francisco, and London, but embrace being a remote-first company! There are no restrictions on where you can be located for this role.
As a Member of Technical Staff on this team, you will:
- Design, train, and improve upon cutting-edge search models
- Gather high-quality retrieval datasets and optimize data pipelines for model training and evaluation
- Work closely with the model serving team to ensure that inference is fast and stable
- Collaborate with product teams to develop solutions
- Engage in research collaborations with our partner organizations and academic affiliations, and publish your work in top-tier conferences and journals
- Join us at a pivotal moment, shape what we build, have a strong ownership mindset, and wear multiple hats!
You may be a good fit if you have:
- Proficiency in Python and related ML frameworks such as PyTorch, TensorFlow, TF-Serving, JAX, and XLA/MLIR
- Experience training representation models, or using text embeddings in downstream tasks
- Familiarity with various information retrieval techniques, such as lexical search and dense vector search
- Familiarity with autoregressive sequence models, such as Transformers
- Strong communication and problem-solving skills
Bonus points for:
- Proficiency in other programming languages, such as C++ or Golang
- Experience using large-scale distributed training strategies with GPUs
This is neither an exhaustive nor a necessary set of attributes. Even if none of these apply to you, but you believe you will contribute to Cohere, please reach out. We have a wide variety of backgrounds at Cohere.
If some of the above doesn’t line up perfectly with your experience, we still encourage you to apply! If you consider yourself a thoughtful worker, a lifelong learner, and a kind and playful team member, Cohere is the place for you.
We value and celebrate diversity and strive to create an inclusive work environment for all. We welcome applicants of all kinds and are committed to providing an equal opportunity process. Cohere provides accessibility accommodations during the recruitment process. Should you require any accommodation, please let us know and we will work with you to meet your needs.
Our Perks:
🤝 An open and inclusive culture and work environment
🧑‍💻 Work closely with a team on the cutting edge of AI research
🍽 Free daily lunch
🦷 Full health and dental benefits, including a separate budget to take care of your mental health
🐣 100% Parental Leave top-up for 6 months for employees based in Canada, the US, and the UK
🎨 Personal enrichment benefits towards arts and culture, fitness and well-being, quality time, and workspace improvement
🏙 Remote-flexible, with offices in Toronto, Palo Alto, San Francisco, and London, and a co-working stipend
✈️ 6 weeks of vacation
Note: This post is co-authored by both Cohere humans and Cohere technology.