gensbi.models.flux1.math#
Functions#
apply_rope | Apply rotary positional embeddings.
attention | Compute attention mechanism.
rope | Compute rotary positional embeddings.
Module Contents#
- gensbi.models.flux1.math.apply_rope(xq, xk, freqs_cis)[source]#
Apply rotary positional embeddings.
- Parameters:
xq (Array) – Query tensor.
xk (Array) – Key tensor.
freqs_cis (Array) – Frequency embeddings.
- Returns:
Transformed query and key tensors.
- Return type:
Tuple[Array, Array]
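A minimal sketch of how `apply_rope` can be used. The `rope` helper below is an illustrative reconstruction (not the library source) of the common Flux convention, in which `freqs_cis` packs per-position 2x2 rotation matrices of shape `(..., seq_len, head_dim // 2, 2, 2)`; the rotation is then applied to adjacent channel pairs of the query and key tensors.

```python
import jax.numpy as jnp

def rope(pos, dim, theta=10000.0):
    # Illustrative helper (assumed, not the library's exact code):
    # build per-position 2x2 rotation matrices for each channel pair.
    scale = jnp.arange(0, dim, 2) / dim
    omega = 1.0 / (theta ** scale)            # (dim/2,)
    angles = pos[..., None] * omega           # (..., seq, dim/2)
    rot = jnp.stack(
        [jnp.cos(angles), -jnp.sin(angles),
         jnp.sin(angles),  jnp.cos(angles)], axis=-1)
    return rot.reshape(*angles.shape, 2, 2)   # (..., seq, dim/2, 2, 2)

def apply_rope(xq, xk, freqs_cis):
    # Pair up adjacent channels and rotate each (x0, x1) pair by the
    # position-dependent 2x2 matrix stored in freqs_cis.
    xq_ = xq.reshape(*xq.shape[:-1], -1, 1, 2)
    xk_ = xk.reshape(*xk.shape[:-1], -1, 1, 2)
    xq_out = freqs_cis[..., 0] * xq_[..., 0] + freqs_cis[..., 1] * xq_[..., 1]
    xk_out = freqs_cis[..., 0] * xk_[..., 0] + freqs_cis[..., 1] * xk_[..., 1]
    return xq_out.reshape(xq.shape), xk_out.reshape(xk.shape)

# Usage: q, k of shape (batch, heads, seq, head_dim); freqs_cis is given
# leading singleton dims so it broadcasts over batch and heads.
q = jnp.ones((1, 2, 4, 8))
k = jnp.ones((1, 2, 4, 8))
pe = rope(jnp.arange(4), 8)[None, None]       # (1, 1, 4, 4, 2, 2)
q_rot, k_rot = apply_rope(q, k, pe)
```

Because each pair is transformed by a pure rotation, the per-token norms of `q` and `k` are preserved, and position 0 (zero angle) is left unchanged.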
- gensbi.models.flux1.math.attention(q, k, v, pe=None, mask=None)[source]#
Compute attention mechanism.
- Parameters:
q (Array) – Query tensor.
k (Array) – Key tensor.
v (Array) – Value tensor.
pe (Optional[Array]) – Rotary positional embeddings (freqs_cis); if None, no rotary embedding is applied.
mask (Optional[Array]) – Attention mask.
- Returns:
Attention output.
- Return type:
Array
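A hedged sketch of what such an attention function might look like in JAX: rotary embeddings are applied first (via this module's `apply_rope`, documented above), followed by scaled dot-product attention. The einsum formulation and the final head-merging layout `(batch, seq, heads * head_dim)` are assumptions for illustration, not the library's actual implementation.

```python
import jax
import jax.numpy as jnp

def attention(q, k, v, pe=None, mask=None):
    """Sketch of multi-head scaled dot-product attention.

    q, k, v: (batch, heads, seq, head_dim). pe: rotary embedding matrices
    (see apply_rope, documented above). mask: boolean, broadcastable to
    the (batch, heads, q_len, k_len) score tensor; False entries are masked.
    """
    if pe is not None:
        q, k = apply_rope(q, k, pe)  # noqa: F821 -- module function, see above
    scores = jnp.einsum("bhqd,bhkd->bhqk", q, k) / jnp.sqrt(q.shape[-1])
    if mask is not None:
        scores = jnp.where(mask, scores, -jnp.inf)
    weights = jax.nn.softmax(scores, axis=-1)
    out = jnp.einsum("bhqk,bhkd->bhqd", weights, v)
    # Merge heads back into the channel dimension (assumed output layout).
    b, h, l, d = out.shape
    return out.transpose(0, 2, 1, 3).reshape(b, l, h * d)

# Usage: with identical q and k rows the scores are uniform, so the output
# at every query position is the mean of v over keys.
q = jnp.ones((2, 3, 5, 4))
k = jnp.ones((2, 3, 5, 4))
v = jnp.arange(2 * 3 * 5 * 4, dtype=jnp.float32).reshape(2, 3, 5, 4)
out = attention(q, k, v)
```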