International Journal of All Research Education & Scientific Methods

An ISO Certified Peer-Reviewed Journal

ISSN: 2455-6211


Movie Recommendation Using BERT, General Purpose Text Feature Extractor and Nearest Neighbour Search

Author Name : Rohan Binaykia, Rajib Lochan Das, Kaushal Attaluri, Vaishnavi Tiwari, Vinay Chowdary Vemuri, Venkata Sai Sri Ram Chunduri

ABSTRACT

BERT (Bidirectional Encoder Representations from Transformers) is a language representation model. Typical language models take only the left context of a sentence into account, discarding context that could otherwise be used.

One workaround is to also train a model right-to-left and concatenate the two representations for the downstream task, as sketched below.
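To make that workaround concrete, here is a minimal PyTorch sketch of the shallow concatenation: two unidirectional encoders read the sequence in opposite directions and their outputs are joined position by position. The module names and dimensions are illustrative assumptions, not taken from the paper or any particular model.

```python
import torch
import torch.nn as nn

class ShallowBiLM(nn.Module):
    """Illustrative shallow bidirectional encoder (not the paper's model)."""

    def __init__(self, vocab_size=30000, dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.fwd_lstm = nn.LSTM(dim, dim, batch_first=True)  # reads left-to-right
        self.bwd_lstm = nn.LSTM(dim, dim, batch_first=True)  # reads right-to-left

    def forward(self, token_ids):
        x = self.embed(token_ids)                       # (batch, seq, dim)
        fwd_out, _ = self.fwd_lstm(x)
        # Reverse the sequence so the second LSTM reads right-to-left,
        # then flip its outputs back so positions line up with fwd_out.
        bwd_out, _ = self.bwd_lstm(torch.flip(x, dims=[1]))
        bwd_out = torch.flip(bwd_out, dims=[1])
        # Concatenate the two directions for the downstream task.
        return torch.cat([fwd_out, bwd_out], dim=-1)    # (batch, seq, 2*dim)
```

Note that each direction here is still trained unidirectionally; the two views are only combined at the output, which is exactly the limitation BERT's jointly bidirectional pre-training removes.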

In contrast, BERT is designed to pre-train deeply bidirectional representations by jointly conditioning on both left and right context in all layers. Empirical results show that this approach outperforms other language models on key NLP tasks.

BERT can be applied to various business applications such as chatbots, customer review analysis, and search engines. While some of these applications have been thoroughly tried and tested, with well-documented observations, other applications are yet to be explored to the fullest extent.

We chose to delve deeper into how BERT would affect search results, and that became the focus of this project.

In this experiment, we use a pre-trained BERT model checkpoint to build a general-purpose text feature extractor, which we apply to the task of nearest neighbour search.
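As a rough illustration of that pipeline, the sketch below embeds texts with a pre-trained BERT checkpoint and indexes them for nearest neighbour search. It assumes the Hugging Face transformers library, the bert-base-uncased checkpoint, mean pooling over the last hidden layer, and scikit-learn; none of these specifics, nor the toy movie plots, come from the paper.

```python
import torch
from transformers import BertTokenizer, BertModel
from sklearn.neighbors import NearestNeighbors

# Assumed checkpoint; the paper does not specify which one was used.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def embed(texts):
    """Mean-pool the last hidden layer into one fixed-size vector per text."""
    enc = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        out = model(**enc).last_hidden_state        # (batch, seq, 768)
    mask = enc["attention_mask"].unsqueeze(-1)      # zero out padding tokens
    return ((out * mask).sum(1) / mask.sum(1)).numpy()

# Illustrative plot summaries standing in for the real movie corpus.
plots = [
    "A hacker discovers reality is a simulation.",
    "A ship sinks after hitting an iceberg.",
    "Toys come to life when humans leave the room.",
]
index = NearestNeighbors(n_neighbors=2, metric="cosine").fit(embed(plots))

query = embed(["A programmer learns the world is not real."])
_, idx = index.kneighbors(query)
print([plots[i] for i in idx[0]])  # plots most similar to the query
```

In practice, the pooling strategy (the [CLS] token versus mean pooling) and the distance metric can noticeably change retrieval quality, so these choices should be treated as tunable rather than fixed.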