
X-CUBE-AI failed to convert my model and reported the following error.

Nephalem
Associate II

I have provided my ONNX model, which has been tested.

INTERNAL ERROR: H not found in shape with shape map (BATCH, CH, W)

ACCEPTED SOLUTION

Hello @Nephalem,

 

Your issue comes from the Einsum layer.

In your case, the operation that does not work is einsum("bhqk,bkhd->bqhd").

 

[Image: JulianE_0-1742984898707.png]

 

It seems that we don't fully support Einsum currently, so you need to replace the einsum layers (you have multiple ones in your model) with simple matrix operations instead. For example:

 

Equivalent Operations:
Instead of einsum("bhqk,bkhd->bqhd", A, B), use:

import torch

# Example dimensions (illustrative values)
batch, heads, query, key, dim = 2, 4, 8, 8, 16

# Example tensors
A = torch.randn(batch, heads, query, key)  # (b, h, q, k)
B = torch.randn(batch, key, heads, dim)    # (b, k, h, d)

# Transpose B to (b, h, k, d) so that k aligns for matmul
B_transposed = B.permute(0, 2, 1, 3)  # (b, h, k, d)

# Perform batched matrix multiplication
result = torch.matmul(A, B_transposed)  # (b, h, q, d)

# Swap axes to match expected output shape (b, q, h, d)
result = result.permute(0, 2, 1, 3)  # (b, q, h, d)
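As a sanity check, the manual matmul/permute version can be compared against torch.einsum directly before exporting the modified model. This is a minimal sketch with illustrative dimensions; the variable names are placeholders, not taken from the actual model:

```python
import torch

# Illustrative dimensions (placeholders, not from the actual model)
batch, heads, query, key, dim = 2, 4, 8, 8, 16

A = torch.randn(batch, heads, query, key)  # (b, h, q, k)
B = torch.randn(batch, key, heads, dim)    # (b, k, h, d)

# Reference result from the original einsum expression
reference = torch.einsum("bhqk,bkhd->bqhd", A, B)

# Manual equivalent: align k for matmul, then reorder axes
manual = torch.matmul(A, B.permute(0, 2, 1, 3)).permute(0, 2, 1, 3)

# The two should match up to floating-point tolerance
assert torch.allclose(reference, manual, atol=1e-5)
print(manual.shape)  # torch.Size([2, 8, 4, 16]) i.e. (b, q, h, d)
```

If the assertion passes for your real tensor shapes, the replacement is safe to substitute into the model before re-running the X-CUBE-AI conversion.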

Have a good day,

Julian


In order to give better visibility on the answered topics, please click on 'Accept as Solution' on the reply which solved your issue or answered your question.


2 REPLIES
Julian E.
ST Employee

Hello @Nephalem ,

 

There is probably something going wrong during the conversion of the model because of the original shape of your input.

I'll take a look and update you once I know more.

 

Have a good day,

Julian


