dev.langchain4j : langchain4j-embeddings-all-minilm-l6-v2-q : 0.34.0 : all-minilm-l6-v2-q.onnx (Maven / Gradle / Ivy)
In-process all-minilm-l6-v2 (quantized) embedding model
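To use this model from a Java application, add the artifact as a dependency. The coordinates below are taken from this page (the artifact is published under the dev.langchain4j group):

    <dependency>
        <groupId>dev.langchain4j</groupId>
        <artifactId>langchain4j-embeddings-all-minilm-l6-v2-q</artifactId>
        <version>0.34.0</version>
    </dependency>

A minimal usage sketch follows. It assumes the class AllMiniLmL6V2QuantizedEmbeddingModel lives in the package shown, which is where recent langchain4j releases place it; the package has moved between versions, so check your release if the import does not resolve:

    import dev.langchain4j.data.embedding.Embedding;
    import dev.langchain4j.model.embedding.EmbeddingModel;
    import dev.langchain4j.model.embedding.onnx.allminilml6v2q.AllMiniLmL6V2QuantizedEmbeddingModel;

    public class EmbeddingExample {

        public static void main(String[] args) {
            // The quantized ONNX model ships inside the jar and runs in-process,
            // so no external service or API key is needed.
            EmbeddingModel model = new AllMiniLmL6V2QuantizedEmbeddingModel();

            Embedding embedding = model.embed("Hello, world!").content();

            // all-MiniLM-L6-v2 produces 384-dimensional sentence embeddings.
            System.out.println("dimensions: " + embedding.dimension());
        }
    }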
[The remainder of this page is the raw binary content of all-minilm-l6-v2-q.onnx, an ONNX protobuf graph that is not meaningful when rendered as text. The node names that survive the rendering are Constant initializers under /embeddings, /embeddings/LayerNorm, /encoder/layer.0/attention/self, /encoder/layer.0/attention/output/LayerNorm, and /encoder/layer.0/intermediate/intermediate_act_fn, i.e. the embedding layer and the first transformer encoder layer of the quantized all-MiniLM-L6-v2 model; the dump is truncated partway through layer 0.]
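The langchain4j model class loads this file for you, so there is normally no reason to open it directly. Purely for illustration, the graph can be inspected with the ONNX Runtime Java API once the file is extracted from the jar; this sketch assumes the ai.onnxruntime dependency is on the classpath and that the file has been unpacked to the working directory:

    import ai.onnxruntime.OrtEnvironment;
    import ai.onnxruntime.OrtException;
    import ai.onnxruntime.OrtSession;

    public class InspectOnnxModel {

        public static void main(String[] args) throws OrtException {
            // Path to the model extracted from the jar (adjust to your location).
            String modelPath = "all-minilm-l6-v2-q.onnx";

            OrtEnvironment env = OrtEnvironment.getEnvironment();
            try (OrtSession session = env.createSession(modelPath, new OrtSession.SessionOptions())) {
                // Prints the graph's declared inputs and outputs,
                // e.g. input_ids / attention_mask / token_type_ids for a BERT-style encoder.
                System.out.println("inputs:  " + session.getInputNames());
                System.out.println("outputs: " + session.getOutputNames());
            }
        }
    }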