What is the difference between encoder and decoder blocks in the Transformer architecture?

A) Encoder blocks handle input sequence processing, while decoder blocks generate predictions token-by-token.
B) Encoder blocks use self-attention only, while decoder blocks use cross-attention only.
C) Encoder blocks normalize the embeddings, while decoder blocks handle token predictions without embeddings.
D) Encoder blocks are used for classification tasks, while decoder blocks are used only for translation tasks.

Correct Answer:
A) Encoder blocks handle input sequence processing, while decoder blocks generate predictions token-by-token.

Explanation:
Encoder blocks apply bidirectional self-attention to the input sequence, transforming it into a contextual representation. Decoder blocks then generate the output one token at a time: they apply causally masked self-attention over the tokens produced so far and cross-attention over the encoder's representation before predicting the next token. (This is also why option B is wrong: decoder blocks use both self-attention and cross-attention, not cross-attention only.)
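
To make the structural difference concrete, below is a minimal sketch of one encoder block and one decoder block in PyTorch. The layer sizes (d_model=64, n_heads=4, d_ff=256), the post-norm layout, and the class names are illustrative assumptions for this sketch, not the exact configuration of the original Transformer or any particular model:

```python
import torch
import torch.nn as nn

class EncoderBlock(nn.Module):
    """Bidirectional self-attention over the input, then a feed-forward layer."""
    def __init__(self, d_model=64, n_heads=4, d_ff=256):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ff = nn.Sequential(nn.Linear(d_model, d_ff), nn.ReLU(),
                                nn.Linear(d_ff, d_model))
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x):
        # No mask: every input position can attend to every other position.
        attn_out, _ = self.attn(x, x, x)
        x = self.norm1(x + attn_out)
        return self.norm2(x + self.ff(x))

class DecoderBlock(nn.Module):
    """Masked self-attention, cross-attention over encoder output, feed-forward."""
    def __init__(self, d_model=64, n_heads=4, d_ff=256):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.cross_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ff = nn.Sequential(nn.Linear(d_model, d_ff), nn.ReLU(),
                                nn.Linear(d_ff, d_model))
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.norm3 = nn.LayerNorm(d_model)

    def forward(self, y, enc_out):
        T = y.size(1)
        # Causal mask: position t may only attend to positions <= t,
        # which is what enforces token-by-token generation.
        causal = torch.triu(torch.ones(T, T, dtype=torch.bool), diagonal=1)
        self_out, _ = self.self_attn(y, y, y, attn_mask=causal)
        y = self.norm1(y + self_out)
        # Cross-attention: queries from the decoder, keys/values from the encoder.
        cross_out, _ = self.cross_attn(y, enc_out, enc_out)
        y = self.norm2(y + cross_out)
        return self.norm3(y + self.ff(y))

# Toy usage: encode a source sequence, then decode a target sequence against it.
src = torch.randn(2, 10, 64)   # (batch, src_len, d_model)
tgt = torch.randn(2, 7, 64)    # (batch, tgt_len, d_model)
memory = EncoderBlock()(src)
out = DecoderBlock()(tgt, memory)
print(out.shape)               # torch.Size([2, 7, 64])
```

Note the two asymmetries the sketch highlights: only the decoder applies a causal mask, and only the decoder has a cross-attention layer that consumes the encoder's output.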
