WindowAttention class

Window-based multi-head self-attention (W-MSA / SW-MSA).

Attention is computed within local windows of windowSize × windowSize tokens, and a learned relative position bias is added to the attention scores.
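The per-window computation can be illustrated with a minimal single-head sketch in pure Dart. This is an illustration only, not the class's implementation: the real class runs multi-head Tensor ops through its qkv and proj layers, and the `windowAttention` function name and list-based types here are hypothetical.

```dart
import 'dart:math' as math;

/// Single-head attention over one window of N = windowSize * windowSize
/// tokens; `bias` is the (N, N) relative position bias added to the
/// scaled scores before softmax.
List<List<double>> windowAttention(
    List<List<double>> q, // (N, headDim)
    List<List<double>> k, // (N, headDim)
    List<List<double>> v, // (N, headDim)
    List<List<double>> bias) {
  final n = q.length;
  final d = q[0].length;
  final scale = 1 / math.sqrt(d);

  // Scores: scaled q·kᵀ plus the relative position bias term.
  final scores = List.generate(
      n,
      (i) => List.generate(n, (j) {
            var s = 0.0;
            for (var c = 0; c < d; c++) s += q[i][c] * k[j][c];
            return s * scale + bias[i][j];
          }));

  // Row-wise softmax (max-subtracted for numerical stability).
  for (final row in scores) {
    final m = row.reduce(math.max);
    var sum = 0.0;
    for (var j = 0; j < n; j++) {
      row[j] = math.exp(row[j] - m);
      sum += row[j];
    }
    for (var j = 0; j < n; j++) row[j] /= sum;
  }

  // Output: attention-weighted sum of the values.
  return List.generate(
      n,
      (i) => List.generate(d, (c) {
            var out = 0.0;
            for (var j = 0; j < n; j++) out += scores[i][j] * v[j][c];
            return out;
          }));
}
```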

Constructors

WindowAttention({required int dim, required int windowSize, required int numHeads})

Properties

dim → int
final
hashCode → int
The hash code for this object.
no setter, inherited
headDim → int
final
numHeads → int
final
proj → Linear
getter/setter pair
qkv → Linear
getter/setter pair
relativePositionBiasTable → Tensor
Relative position bias table with shape ((2*windowSize - 1) * (2*windowSize - 1), numHeads).
getter/setter pair
relativePositionIndex → Tensor
Relative position index with shape (windowSize*windowSize, windowSize*windowSize).
getter/setter pair
runtimeType → Type
A representation of the runtime type of the object.
no setter, inherited
windowSize → int
final
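The way relativePositionIndex maps each token pair to a row of the bias table can be sketched in pure Dart. This follows the standard windowed-attention construction (each pair's relative offset, shifted into a non-negative range, indexes the (2w-1)² table rows); the `relativePositionIndex` function below is a hypothetical illustration, not the class's stored Tensor.

```dart
/// Relative position index for a windowSize × windowSize window:
/// entry (i, j) selects the bias-table row for token pair (i, j).
List<List<int>> relativePositionIndex(int windowSize) {
  final n = windowSize * windowSize;
  final span = 2 * windowSize - 1; // distinct offsets per axis
  return List.generate(n, (i) {
    final hi = i ~/ windowSize, wi = i % windowSize;
    return List.generate(n, (j) {
      final hj = j ~/ windowSize, wj = j % windowSize;
      // Shift each relative offset from [-(w-1), w-1] into [0, 2w-2].
      final dh = hi - hj + windowSize - 1;
      final dw = wi - wj + windowSize - 1;
      return dh * span + dw; // in [0, (2w-1)^2 - 1]
    });
  });
}
```

For windowSize = 2 this yields a 4×4 index with values in [0, 8], matching the 9 rows of a (2*2-1)*(2*2-1) bias table.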

Methods

forward(Tensor x, {Tensor? mask}) → Tensor
Forward pass: computes windowed multi-head self-attention over x, adding the relative position bias and, if mask is provided, the attention mask (used for shifted windows).
noSuchMethod(Invocation invocation) → dynamic
Invoked when a nonexistent method or property is accessed.
inherited
toString() → String
A string representation of this object.
inherited
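The shapes involved in forward() can be worked through with simple integer arithmetic. Assuming the common windowed-attention layout (an assumption, not stated by this API): x holds numWindows*B windows of N = windowSize*windowSize tokens of width dim, qkv maps dim to 3*dim, and each head attends over headDim = dim ~/ numHeads channels. The `forwardShapes` helper below is hypothetical:

```dart
/// Shape bookkeeping for a windowed-attention forward pass
/// (hypothetical helper; shapes assume the standard layout).
Map<String, int> forwardShapes(
    {required int dim, required int windowSize, required int numHeads}) {
  return {
    // Tokens attended over in each window.
    'tokensPerWindow': windowSize * windowSize,
    // Channels per head; dim should divide evenly by numHeads.
    'headDim': dim ~/ numHeads,
    // qkv projects dim → 3*dim before splitting into q, k, v.
    'qkvOutDim': 3 * dim,
    // One bias-table row per distinct 2-D relative offset.
    'biasTableRows': (2 * windowSize - 1) * (2 * windowSize - 1),
  };
}
```

For example, dim = 96, windowSize = 7, numHeads = 3 gives 49 tokens per window, headDim = 32, and a 169-row bias table.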

Operators

operator ==(Object other) → bool
The equality operator.
inherited