Commit 5df85022 authored by Mohammad Shoeybi's avatar Mohammad Shoeybi

Merge branch 'fixfp16gensamples' into 'master'

fixed a bug on fp16 while generating samples

See merge request ADLR/megatron-lm!29
parents a4cb4153 cfc6924b
+4 −0
@@ -98,6 +98,10 @@ def get_batch(context_tokens, args):
        args.reset_attention_mask,
        False)

    # Fp16 conversion.
    if args.fp16:
        attention_mask = attention_mask.half()

    return tokens, attention_mask, position_ids
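The added lines cast the attention mask to half precision whenever fp16 mode is on, so the mask's dtype matches the model's fp16 activations. A minimal sketch of the idea, using NumPy rather than the repository's actual `torch.Tensor.half()` call (the function name and shapes here are illustrative, not from Megatron-LM):

```python
import numpy as np

def get_causal_mask(seq_len, fp16=False):
    # Lower-triangular causal mask: position i may attend to positions j <= i.
    mask = np.tril(np.ones((seq_len, seq_len), dtype=np.float32))
    # Mirror of the fix: cast the mask to half precision so that
    # mask * scores stays in fp16 instead of being promoted to fp32.
    if fp16:
        mask = mask.astype(np.float16)
    return mask

# fp16 attention scores, as produced by a model running in half precision.
scores = np.random.randn(4, 4).astype(np.float16)
mask = get_causal_mask(4, fp16=True)
masked = scores * mask  # stays float16; a float32 mask would promote the result
```

Without the cast, multiplying fp16 scores by an fp32 mask silently promotes the result to fp32 (or, in strict frameworks, raises a dtype mismatch), which is the kind of failure this fix addresses during sample generation.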

def top_k_logits(logits, top_k=0, top_p=0.0, filter_value=-float('Inf')):