This week you'll learn about Neural Machine Translation with Luong's Attention Using TensorFlow and Keras.

In my childhood days, I was a big fan of racing games. Be it cars or bikes, the rush of driving at breakneck speed (in a safe virtual world; please follow traffic rules in the real one) made me exuberant.

As I was growing up, the complexities of these games slowly started to unravel. For example, I used to go to school wondering if I should choose a car with a better top speed or another vehicle with better acceleration. A good example of a school-life crisis! 

The objective was the same either way: to finish first. So really, it was up to you to decide whether you wanted faster acceleration as a short-term boost or a higher top speed as a long-term advantage.

Now, we just finished learning about Bahdanau's Attention last week. So, why are we learning another variant of Attention (i.e., Luong's)? 

The answer is analogous to my example above: both accomplish neural machine translation (NMT), but each has its own specialty, leaving you free to choose whichever suits your task.

The big picture: Luong's Attention is another attention mechanism for NMT, differing from Bahdanau's Attention on a few key points that establish its individuality.

How it works: Bahdanau's Attention uses a small neural network to score the relationship between the query and the key. Luong's Attention, also known as multiplicative attention, scores that relationship with a simple dot product instead.
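
To make the contrast concrete, here is a minimal sketch of the two scoring functions in TensorFlow/Keras. The class names and shapes are our own illustrative assumptions, not code from the tutorial: the decoder query has shape (batch, 1, units) and the encoder keys have shape (batch, src_len, units).

    import tensorflow as tf

    # Bahdanau (additive): a small neural network scores each query-key pair.
    # (Illustrative sketch; class names are ours, not the tutorial's.)
    class BahdanauScore(tf.keras.layers.Layer):
        def __init__(self, units):
            super().__init__()
            self.W1 = tf.keras.layers.Dense(units)  # projects the query
            self.W2 = tf.keras.layers.Dense(units)  # projects the keys
            self.V = tf.keras.layers.Dense(1)       # collapses to a scalar score

        def call(self, query, keys):
            # score = V^T tanh(W1 q + W2 k) -> (batch, src_len, 1)
            return self.V(tf.nn.tanh(self.W1(query) + self.W2(keys)))

    # Luong (multiplicative): a plain dot product between query and keys.
    class LuongDotScore(tf.keras.layers.Layer):
        def call(self, query, keys):
            # score = q . k^T -> (batch, 1, src_len)
            return tf.matmul(query, keys, transpose_b=True)

    # Either set of scores passes through a softmax to become attention weights.
    query = tf.random.normal((2, 1, 16))  # dummy decoder state
    keys = tf.random.normal((2, 8, 16))   # dummy encoder states
    weights = tf.nn.softmax(LuongDotScore()(query, keys), axis=-1)
    print(weights.shape)  # (2, 1, 8)

Notice that Luong's dot score needs no trainable parameters, which is exactly what makes it cheaper than Bahdanau's additive score.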

Our thoughts: In our experiments, we saw a noticeable improvement in results when using Luong's Attention.

Yes, but: That improvement might not carry over to every dataset.

Stay smart: Experiment with both types of attention on various datasets to solidify your understanding of each.

Click here to read the full tutorial

Solve Your CV/DL Problems This Week (or Weekend) with Our Working Code

You can instantly access all of the code for Neural Machine Translation with Luong's Attention Using TensorFlow and Keras by joining PyImageSearch University. Get working code to:

  1. Finish your project this weekend with our code
  2. Solve your thorniest coding problems at work this week to show off your expertise
  3. Publish groundbreaking research without multiple tries at coding the hard parts

Guaranteed Results: If you haven't accomplished your CV/DL goals, let us know within 30 days and get a full refund.

Yes, I want the code

Note: You may have missed this, but last Wednesday, we published a new post on Computer Vision and Deep Learning for Electricity. Do you need some help with your industrial application of AI? Learn more about our consulting services.


A PyImageSearch Team Member
