gensbi.models.flux1.math

Functions

apply_rope(xq, xk, freqs_cis)
    Apply rotary positional embeddings.

attention(q, k, v[, pe, mask])
    Compute attention over the query, key, and value tensors.

rope(pos, dim, theta)
    Compute rotary positional embeddings.

Module Contents

gensbi.models.flux1.math.apply_rope(xq, xk, freqs_cis)

Apply rotary positional embeddings.

Parameters:
  • xq (Array) – Query tensor.
  • xk (Array) – Key tensor.
  • freqs_cis (Array) – Precomputed rotary frequency embeddings, as returned by rope().

Returns:
  Query and key tensors with the rotary embeddings applied.

Return type:
  Tuple[Array, Array]

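A minimal usage sketch, assuming the usual Flux conventions: query/key tensors shaped (batch, heads, seq_len, head_dim) and freqs_cis produced by rope() for the same positions and head dimension. The shapes and the theta value below are illustrative assumptions, not part of this reference.

```python
import jax
import jax.numpy as jnp

from gensbi.models.flux1.math import rope, apply_rope

# Assumed layout: (batch, heads, seq_len, head_dim).
batch, heads, seq_len, head_dim = 1, 4, 16, 64

kq, kk = jax.random.split(jax.random.PRNGKey(0))
xq = jax.random.normal(kq, (batch, heads, seq_len, head_dim))
xk = jax.random.normal(kk, (batch, heads, seq_len, head_dim))

# Rotation factors for positions 0..seq_len-1 (theta = 10_000 is the common RoPE base).
pos = jnp.arange(seq_len)[None, :]
freqs_cis = rope(pos, head_dim, 10_000)

# Rotate queries and keys instead of adding a positional encoding.
xq_rot, xk_rot = apply_rope(xq, xk, freqs_cis)
```
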
gensbi.models.flux1.math.attention(q, k, v, pe=None, mask=None)

Compute attention over the query, key, and value tensors.

Parameters:
  • q (Array) – Query tensor.
  • k (Array) – Key tensor.
  • v (Array) – Value tensor.
  • pe (Optional[Array]) – Positional encoding.
  • mask (Optional[Array]) – Attention mask.

Returns:
  Attention output.

Return type:
  Array

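A hedged sketch of the combined call, again assuming Flux-style shapes and that pe accepts the output of rope(); an attention mask can optionally be supplied via mask, whose expected format is not specified here.

```python
import jax
import jax.numpy as jnp

from gensbi.models.flux1.math import rope, attention

# Assumed layout: (batch, heads, seq_len, head_dim).
batch, heads, seq_len, head_dim = 1, 4, 16, 64

kq, kk, kv = jax.random.split(jax.random.PRNGKey(1), 3)
q = jax.random.normal(kq, (batch, heads, seq_len, head_dim))
k = jax.random.normal(kk, (batch, heads, seq_len, head_dim))
v = jax.random.normal(kv, (batch, heads, seq_len, head_dim))

# Rotary factors for the sequence positions (theta = 10_000 assumed).
pe = rope(jnp.arange(seq_len)[None, :], head_dim, 10_000)

# Attention with rotary positional embeddings; mask is left at its default (None).
out = attention(q, k, v, pe=pe)
```
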
gensbi.models.flux1.math.rope(pos, dim, theta)

Compute rotary positional embeddings.

Parameters:
  • pos (Array) – Position tensor.
  • dim (int) – Dimension of the embeddings.
  • theta (int) – Base frequency used to scale the rotation angles (commonly 10000).

Returns:
  Rotary positional embeddings for the given positions.

Return type:
  Array
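
For intuition, a standalone sketch of the standard Flux-style RoPE computation that a function with this signature typically performs: each (position, frequency) pair is turned into a 2x2 rotation matrix with angle pos * theta**(-2i/dim). This is an illustrative reimplementation under that assumption, not the library's code, and the gensbi implementation may differ in dtype or output layout.

```python
import jax.numpy as jnp


def rope_reference(pos, dim, theta):
    """Illustrative RoPE sketch: (..., seq) positions -> (..., seq, dim/2, 2, 2) rotations."""
    # Per-pair inverse frequencies: theta ** (-2i / dim) for i in 0..dim/2 - 1.
    scale = jnp.arange(0, dim, 2) / dim
    omega = 1.0 / (theta ** scale)              # (dim/2,)
    angles = pos[..., None] * omega             # (..., seq, dim/2)
    # Entries of the 2x2 rotation matrix for every (position, frequency) pair.
    out = jnp.stack(
        [jnp.cos(angles), -jnp.sin(angles), jnp.sin(angles), jnp.cos(angles)],
        axis=-1,
    )
    return out.reshape(*out.shape[:-1], 2, 2)


# Example: rotation factors for 16 positions of a 64-dimensional head.
freqs = rope_reference(jnp.arange(16)[None, :], 64, 10_000)
print(freqs.shape)  # (1, 16, 32, 2, 2)
```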