[Binary HDF5 container bytes removed; the recoverable metadata and weight-name listing are summarized below.]

This is a Keras-format HDF5 weights file. Root-level attributes:

    backend:       tensorflow
    keras_version: 2.2.4-tf
    layer_names:   transformer_encoder

The `weight_names` attribute of the `transformer_encoder` group lists every variable under the common prefix `stack_13/transformer/transformer_encoder/`. The kernel + `g` pairs named `weight_norm_dense_*` are consistent with weight-normalized dense layers (a kernel plus a per-output scale `g`).

Input embedding (`transformer_input_embedding/`):

    embedding/embeddings:0
    position_embedding/dense/kernel:0
    position_embedding/dense/bias:0

Encoder blocks (`encoder_stack/transformer_encoder_block[_i]`, with block 0 unsuffixed; blocks 0 through 8 appear in this truncated dump) all repeat the same pattern. For block i:

    transformer_self_attention_i/
        layer_norm_*/gamma:0, layer_norm_*/beta:0            (pre-attention layer norm)
        self_attention_i/multi_head_attention_i/attention_qkv_projection_i/
            weight_norm_dense_*/kernel:0, weight_norm_dense_*/g:0   (three dense layers,
            layer_norm_*/gamma:0, layer_norm_*/beta:0                presumably the Q/K/V
            ... repeated three times, alternating dense/norm ...     projections)
        self_attention_i/multi_head_attention_i/
            weight_norm_dense_*/kernel:0, weight_norm_dense_*/g:0   (output projection)
    transformer_feed_forward_i/
        layer_norm_*/gamma:0, layer_norm_*/beta:0
        stack_i/weight_norm_dense_*/kernel:0, bias:0, g:0    (two dense layers:
        stack_i/weight_norm_dense_*/kernel:0, bias:0, g:0     expansion and projection)

The dump is truncated mid-list inside block 8, after `attention_qkv_projection_8/layer_norm_50/beta:0`.
stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_8/transformer_self_attention_8/self_attention_8/multi_head_attention_8/attention_qkv_projection_8/weight_norm_dense_58/kernel:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_8/transformer_self_attention_8/self_attention_8/multi_head_attention_8/attention_qkv_projection_8/weight_norm_dense_58/g:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_8/transformer_self_attention_8/self_attention_8/multi_head_attention_8/attention_qkv_projection_8/layer_norm_51/gamma:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_8/transformer_self_attention_8/self_attention_8/multi_head_attention_8/attention_qkv_projection_8/layer_norm_51/beta:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_8/transformer_self_attention_8/self_attention_8/multi_head_attention_8/weight_norm_dense_59/kernel:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_8/transformer_self_attention_8/self_attention_8/multi_head_attention_8/weight_norm_dense_59/g:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_8/transformer_feed_forward_8/layer_norm_17/gamma:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_8/transformer_feed_forward_8/layer_norm_17/beta:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_8/transformer_feed_forward_8/stack_8/weight_norm_dense_16/kernel:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_8/transformer_feed_forward_8/stack_8/weight_norm_dense_16/bias:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_8/transformer_feed_forward_8/stack_8/weight_norm_dense_16/g:0 
stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_8/transformer_feed_forward_8/stack_8/weight_norm_dense_17/kernel:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_8/transformer_feed_forward_8/stack_8/weight_norm_dense_17/bias:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_8/transformer_feed_forward_8/stack_8/weight_norm_dense_17/g:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_9/transformer_self_attention_9/layer_norm_18/gamma:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_9/transformer_self_attention_9/layer_norm_18/beta:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_9/transformer_self_attention_9/self_attention_9/multi_head_attention_9/attention_qkv_projection_9/weight_norm_dense_60/kernel:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_9/transformer_self_attention_9/self_attention_9/multi_head_attention_9/attention_qkv_projection_9/weight_norm_dense_60/g:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_9/transformer_self_attention_9/self_attention_9/multi_head_attention_9/attention_qkv_projection_9/layer_norm_52/gamma:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_9/transformer_self_attention_9/self_attention_9/multi_head_attention_9/attention_qkv_projection_9/layer_norm_52/beta:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_9/transformer_self_attention_9/self_attention_9/multi_head_attention_9/attention_qkv_projection_9/weight_norm_dense_61/kernel:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_9/transformer_self_attention_9/self_attention_9/multi_head_attention_9/attention_qkv_projection_9/weight_norm_dense_61/g:0 
stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_9/transformer_self_attention_9/self_attention_9/multi_head_attention_9/attention_qkv_projection_9/layer_norm_53/gamma:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_9/transformer_self_attention_9/self_attention_9/multi_head_attention_9/attention_qkv_projection_9/layer_norm_53/beta:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_9/transformer_self_attention_9/self_attention_9/multi_head_attention_9/attention_qkv_projection_9/weight_norm_dense_62/kernel:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_9/transformer_self_attention_9/self_attention_9/multi_head_attention_9/attention_qkv_projection_9/weight_norm_dense_62/g:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_9/transformer_self_attention_9/self_attention_9/multi_head_attention_9/attention_qkv_projection_9/layer_norm_54/gamma:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_9/transformer_self_attention_9/self_attention_9/multi_head_attention_9/attention_qkv_projection_9/layer_norm_54/beta:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_9/transformer_self_attention_9/self_attention_9/multi_head_attention_9/weight_norm_dense_63/kernel:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_9/transformer_self_attention_9/self_attention_9/multi_head_attention_9/weight_norm_dense_63/g:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_9/transformer_feed_forward_9/layer_norm_19/gamma:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_9/transformer_feed_forward_9/layer_norm_19/beta:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_9/transformer_feed_forward_9/stack_9/weight_norm_dense_18/kernel:0 
stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_9/transformer_feed_forward_9/stack_9/weight_norm_dense_18/bias:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_9/transformer_feed_forward_9/stack_9/weight_norm_dense_18/g:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_9/transformer_feed_forward_9/stack_9/weight_norm_dense_19/kernel:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_9/transformer_feed_forward_9/stack_9/weight_norm_dense_19/bias:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_9/transformer_feed_forward_9/stack_9/weight_norm_dense_19/g:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_10/transformer_self_attention_10/layer_norm_20/gamma:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_10/transformer_self_attention_10/layer_norm_20/beta:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_10/transformer_self_attention_10/self_attention_10/multi_head_attention_10/attention_qkv_projection_10/weight_norm_dense_64/kernel:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_10/transformer_self_attention_10/self_attention_10/multi_head_attention_10/attention_qkv_projection_10/weight_norm_dense_64/g:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_10/transformer_self_attention_10/self_attention_10/multi_head_attention_10/attention_qkv_projection_10/layer_norm_55/gamma:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_10/transformer_self_attention_10/self_attention_10/multi_head_attention_10/attention_qkv_projection_10/layer_norm_55/beta:0 
stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_10/transformer_self_attention_10/self_attention_10/multi_head_attention_10/attention_qkv_projection_10/weight_norm_dense_65/kernel:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_10/transformer_self_attention_10/self_attention_10/multi_head_attention_10/attention_qkv_projection_10/weight_norm_dense_65/g:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_10/transformer_self_attention_10/self_attention_10/multi_head_attention_10/attention_qkv_projection_10/layer_norm_56/gamma:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_10/transformer_self_attention_10/self_attention_10/multi_head_attention_10/attention_qkv_projection_10/layer_norm_56/beta:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_10/transformer_self_attention_10/self_attention_10/multi_head_attention_10/attention_qkv_projection_10/weight_norm_dense_66/kernel:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_10/transformer_self_attention_10/self_attention_10/multi_head_attention_10/attention_qkv_projection_10/weight_norm_dense_66/g:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_10/transformer_self_attention_10/self_attention_10/multi_head_attention_10/attention_qkv_projection_10/layer_norm_57/gamma:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_10/transformer_self_attention_10/self_attention_10/multi_head_attention_10/attention_qkv_projection_10/layer_norm_57/beta:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_10/transformer_self_attention_10/self_attention_10/multi_head_attention_10/weight_norm_dense_67/kernel:0 
stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_10/transformer_self_attention_10/self_attention_10/multi_head_attention_10/weight_norm_dense_67/g:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_10/transformer_feed_forward_10/layer_norm_21/gamma:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_10/transformer_feed_forward_10/layer_norm_21/beta:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_10/transformer_feed_forward_10/stack_10/weight_norm_dense_20/kernel:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_10/transformer_feed_forward_10/stack_10/weight_norm_dense_20/bias:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_10/transformer_feed_forward_10/stack_10/weight_norm_dense_20/g:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_10/transformer_feed_forward_10/stack_10/weight_norm_dense_21/kernel:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_10/transformer_feed_forward_10/stack_10/weight_norm_dense_21/bias:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_10/transformer_feed_forward_10/stack_10/weight_norm_dense_21/g:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_11/transformer_self_attention_11/layer_norm_22/gamma:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_11/transformer_self_attention_11/layer_norm_22/beta:0 
stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_11/transformer_self_attention_11/self_attention_11/multi_head_attention_11/attention_qkv_projection_11/weight_norm_dense_68/kernel:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_11/transformer_self_attention_11/self_attention_11/multi_head_attention_11/attention_qkv_projection_11/weight_norm_dense_68/g:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_11/transformer_self_attention_11/self_attention_11/multi_head_attention_11/attention_qkv_projection_11/layer_norm_58/gamma:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_11/transformer_self_attention_11/self_attention_11/multi_head_attention_11/attention_qkv_projection_11/layer_norm_58/beta:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_11/transformer_self_attention_11/self_attention_11/multi_head_attention_11/attention_qkv_projection_11/weight_norm_dense_69/kernel:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_11/transformer_self_attention_11/self_attention_11/multi_head_attention_11/attention_qkv_projection_11/weight_norm_dense_69/g:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_11/transformer_self_attention_11/self_attention_11/multi_head_attention_11/attention_qkv_projection_11/layer_norm_59/gamma:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_11/transformer_self_attention_11/self_attention_11/multi_head_attention_11/attention_qkv_projection_11/layer_norm_59/beta:0 
stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_11/transformer_self_attention_11/self_attention_11/multi_head_attention_11/attention_qkv_projection_11/weight_norm_dense_70/kernel:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_11/transformer_self_attention_11/self_attention_11/multi_head_attention_11/attention_qkv_projection_11/weight_norm_dense_70/g:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_11/transformer_self_attention_11/self_attention_11/multi_head_attention_11/attention_qkv_projection_11/layer_norm_60/gamma:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_11/transformer_self_attention_11/self_attention_11/multi_head_attention_11/attention_qkv_projection_11/layer_norm_60/beta:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_11/transformer_self_attention_11/self_attention_11/multi_head_attention_11/weight_norm_dense_71/kernel:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_11/transformer_self_attention_11/self_attention_11/multi_head_attention_11/weight_norm_dense_71/g:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_11/transformer_feed_forward_11/layer_norm_23/gamma:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_11/transformer_feed_forward_11/layer_norm_23/beta:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_11/transformer_feed_forward_11/stack_11/weight_norm_dense_22/kernel:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_11/transformer_feed_forward_11/stack_11/weight_norm_dense_22/bias:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_11/transformer_feed_forward_11/stack_11/weight_norm_dense_22/g:0 
stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_11/transformer_feed_forward_11/stack_11/weight_norm_dense_23/kernel:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_11/transformer_feed_forward_11/stack_11/weight_norm_dense_23/bias:0 stack_13/transformer/transformer_encoder/encoder_stack/transformer_encoder_block_11/transformer_feed_forward_11/stack_11/weight_norm_dense_23/g:0
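Every entry above follows the Keras/TensorFlow checkpoint naming convention `scope_0/scope_1/.../variable:output_index`. A minimal sketch (pure Python; the helper name `parse_weight_name` is illustrative, not part of the file) of splitting one such name into its scope path, variable, and output index:

```python
def parse_weight_name(name: str):
    """Split a weight name like 'a/b/kernel:0' into
    (scope list, variable name, output index)."""
    path, _, var = name.rpartition("/")      # everything before the last '/'
    var_name, _, index = var.partition(":")  # 'kernel:0' -> ('kernel', '0')
    return path.split("/"), var_name, int(index)

scopes, var, idx = parse_weight_name(
    "stack_13/transformer/transformer_encoder/encoder_stack/"
    "transformer_encoder_block_6/transformer_feed_forward_6/"
    "stack_6/weight_norm_dense_12/kernel:0"
)
print(scopes[0], var, idx)  # stack_13 kernel 0
```

The trailing `:0` is the output index of the producing op; for these variables it is always `0`.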