Inside LinkedIn’s AI overhaul: Job search powered by LLM distillation




The arrival of natural language search has changed the way people look for information, and LinkedIn, which has been working with numerous AI models over the last year, hopes that shift will extend to job search.

LinkedIn's job search, now available to all LinkedIn users, relies on distilled models fine-tuned on the professional social media platform's knowledge base to surface employment opportunities from natural language queries.

“This new search experience allows members to describe their goals in their own words and get results that truly reflect what they are looking for,” Erran Berger, vice president of product development at LinkedIn, told VentureBeat in an email. “This is the first step in a larger journey to make job searching more intuitive, inclusive and empowering for everyone.”

LinkedIn previously noted in a blog post that a significant problem users faced when searching for jobs on the platform was an over-reliance on exact keyword queries. Users would often type a fairly generic job title and get postings that didn't quite match. From personal experience, if I type “reporter” into LinkedIn, I get search results for reporters at media publications alongside openings for court reporters, which call for a completely different set of skills.

Wenjing Zhang, LinkedIn's vice president of engineering, told VentureBeat in a separate interview that the company saw the need to improve how people find jobs that fit them well, and that this started with a better understanding of what they were looking for.

“So in the past, when we used keywords, we were essentially searching for a keyword and trying to find the exact match. Sometimes the job description might mention reporter, but the role isn't really a reporter; we would still retrieve that posting, which is not ideal for the candidate,” Zhang said.

LinkedIn has improved its understanding of user queries and now lets people go beyond keywords. Instead of searching for “software engineer,” they can ask, “find software engineering jobs in Silicon Valley that were posted recently.”

How they built it

One of the first things LinkedIn had to do was overhaul its search function's ability to understand queries.

“The first stage is that when you type a query, we need to be able to understand it. The next step is to retrieve the right kind of information from our job library. And the last step is, now that you have a couple hundred final candidates, how do you rank them so that the most relevant jobs appear at the top,” Zhang said.
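To make that three-stage flow concrete, here is a minimal, self-contained Python sketch of such a pipeline. The stage names follow Zhang's description, but the data structures, toy job library, filters and overlap-based scoring function are illustrative assumptions, not LinkedIn's implementation.

```python
from dataclasses import dataclass

# Toy in-memory "job library" standing in for the real job index (illustrative data).
JOBS = [
    {"title": "Software Engineer", "location": "Silicon Valley", "days_old": 3},
    {"title": "Court Reporter", "location": "New York", "days_old": 10},
    {"title": "News Reporter", "location": "San Francisco", "days_old": 2},
]

@dataclass
class ParsedQuery:
    role: str
    location: str | None = None
    max_days_old: int | None = None

def understand_query(raw_query: str) -> ParsedQuery:
    """Stage 1: query understanding. A production system would call a fine-tuned LLM;
    this stub simply hard-codes the example query from the article."""
    return ParsedQuery(role="software engineer", location="Silicon Valley", max_days_old=7)

def retrieve(query: ParsedQuery, jobs: list[dict], k: int = 200) -> list[dict]:
    """Stage 2: retrieval -- pull a few hundred plausible postings with cheap filters
    (in practice, semantic search plus structured constraints)."""
    matches = []
    for job in jobs:
        if query.location and job["location"] != query.location:
            continue
        if query.max_days_old and job["days_old"] > query.max_days_old:
            continue
        matches.append(job)
    return matches[:k]

def rank(query: ParsedQuery, candidates: list[dict]) -> list[dict]:
    """Stage 3: ranking -- score each (query, job) pair so the most relevant appear first.
    A crude token-overlap score stands in for the distilled ranking model."""
    def score(job: dict) -> float:
        q_tokens = set(query.role.lower().split())
        t_tokens = set(job["title"].lower().split())
        return len(q_tokens & t_tokens) / max(len(q_tokens), 1)
    return sorted(candidates, key=score, reverse=True)

if __name__ == "__main__":
    parsed = understand_query("find software engineering jobs in Silicon Valley posted recently")
    print(rank(parsed, retrieve(parsed, JOBS)))
```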

LinkedIn had been relying on fixed methods, taxonomies, classification models and older LLMs, which Zhang said “did not have deep semantic understanding capabilities.” The company then turned to larger, fine-tuned large language models (LLMs) to improve the natural language processing (NLP) capabilities of its platform.

But LLMs also bring steep compute costs. LinkedIn therefore turned to distillation methods to cut down on GPU expenses. It split the system into two steps: one to handle data and information retrieval, and another to rank the results. By using a teacher model to score query-job pairs, LinkedIn said it was able to align both the retrieval and ranking models.
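The article does not spell out the training recipe, but the teacher-student pattern it describes can be sketched roughly as below: a small student ranker is trained to reproduce the relevance scores a large teacher LLM assigns to the same query-job pairs. The model shape, MSE loss and random placeholder data are generic assumptions, not details LinkedIn has published.

```python
import torch
import torch.nn as nn

class StudentRanker(nn.Module):
    """Illustrative student ranker: a small MLP over concatenated query/job embeddings."""
    def __init__(self, dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(2 * dim, 128), nn.ReLU(), nn.Linear(128, 1))

    def forward(self, query_emb: torch.Tensor, job_emb: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([query_emb, job_emb], dim=-1)).squeeze(-1)

def distillation_step(student, optimizer, query_emb, job_emb, teacher_scores):
    """One training step: push the cheap student's relevance scores toward the
    scores the large teacher LLM assigned to the same (query, job) pairs."""
    student_scores = student(query_emb, job_emb)
    loss = nn.functional.mse_loss(student_scores, teacher_scores)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    dim, batch = 64, 32
    student = StudentRanker(dim)
    opt = torch.optim.Adam(student.parameters(), lr=1e-3)
    # Random placeholders for embeddings and teacher-assigned relevance scores.
    q, j = torch.randn(batch, dim), torch.randn(batch, dim)
    teacher = torch.rand(batch)  # teacher LLM's relevance judgments in [0, 1]
    print(distillation_step(student, opt, q, j, teacher))
```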

The method also allowed LinkedIn's engineers to reduce the number of stages in its job search system. At one point, “there were nine different stages that made up the pipeline for searching and matching a job,” many of which duplicated work.

“To do this, we use a common technique of multi-objective optimization. To ensure that retrieval and ranking are aligned, it is important for retrieval to rank documents using the same MOO that the ranking stage uses. The goal is to keep retrieval simple without introducing an unnecessary burden on AI developer productivity,” LinkedIn said.
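LinkedIn names multi-objective optimization (MOO) without detailing the objectives involved. The sketch below shows one common way a shared combined objective can keep retrieval and ranking aligned: both stages score jobs with the same weighted function. The objective names and weights are assumptions chosen purely for illustration.

```python
# Illustrative multi-objective score shared by retrieval and ranking, so that the
# documents retrieval surfaces are the same ones the ranker would place on top.
# Objective names and weights are assumptions, not LinkedIn's published values.
OBJECTIVE_WEIGHTS = {
    "relevance": 0.6,   # how well the job matches the query intent
    "quality": 0.25,    # e.g. posting completeness or employer signal
    "freshness": 0.15,  # newer postings score higher
}

def combined_score(objective_scores: dict[str, float]) -> float:
    """Weighted sum of per-objective scores; both stages call this same function
    so their notions of a 'good' job stay aligned."""
    return sum(OBJECTIVE_WEIGHTS[name] * objective_scores.get(name, 0.0)
               for name in OBJECTIVE_WEIGHTS)

# Example: the same job is scored identically during retrieval and during ranking.
job_scores = {"relevance": 0.9, "quality": 0.7, "freshness": 0.4}
print(round(combined_score(job_scores), 3))  # 0.775
```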

LinkedIn also developed a query engine that generates personalized suggestions for users.

LinkedIn is not alone in seeing the potential of LLM-based enterprise search. Google claims 2025 will be the year enterprise search becomes more powerful, thanks to advanced models.

Models like Cohere's Rerank 3.5 help break down language silos within companies. The various “deep research” products from OpenAI, Google and Anthropic point to growing organizational demand for agents that access and analyze internal data sources.

LinkedIn has been rolling out several AI-based features over the past year. In October, it launched an AI assistant to help recruiters find the best candidates.
