RWN - Choices [FS004]

Before feeding variables into the RWN, the features must be made uniform to prevent the weights from being biased by large-magnitude variables. Once importance is calculated, the "Choices" set is reduced to the most impactful variables. The workflow has four steps.

1. Data Preprocessing
Imputation: Replace null values with the mean/median for continuous data or the mode for categorical data.
Normalization: Scale all features to a common range using Min-Max scaling (which maps each feature to [0, 1]) or Z-score standardization.

2. Disambiguated Training Set Preparation
For partial label learning or complex selection tasks (as specified in [FS004] workflows), derive a disambiguated set.
Label Refinement: Use the iterative process to refine the labels, ensuring each input is paired with a high-confidence target.
Matrix Construction: Organize the features into an n × m matrix, where n represents the number of samples and m the initial number of candidate features (the "Choices").

3. Feature Importance Calculation (FIM)
Weight Normalization: Apply a normalization formula (e.g., Eq. 14 in standard FS protocols) to ensure importance weights are comparable across different nodes or decision trees.
Ranking: Rank features by their FIM or SHAP values.

4. Selection via Subset Optimization
Thresholding: Select the top-ranked features (or those exceeding a specific threshold) to obtain the target subset, reducing the "Choices" set to the most impactful variables.
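The preprocessing step (imputation followed by normalization) can be sketched as below; this is a minimal illustration on plain Python lists, and the function names are illustrative rather than part of any [FS004] specification.

```python
# Minimal sketch of imputation + Min-Max normalization on columnar data.
# Names (impute, min_max_scale) are illustrative, not a prescribed API.
from statistics import mean, mode

def impute(column, categorical=False):
    """Replace None entries with the mode (categorical) or mean (continuous)."""
    observed = [v for v in column if v is not None]
    fill = mode(observed) if categorical else mean(observed)
    return [fill if v is None else v for v in column]

def min_max_scale(column):
    """Scale a numeric column to the [0, 1] range."""
    lo, hi = min(column), max(column)
    if hi == lo:  # constant column: avoid division by zero
        return [0.0 for _ in column]
    return [(v - lo) / (hi - lo) for v in column]

age = impute([20, None, 40])   # mean of {20, 40} fills the gap -> [20, 30, 40]
scaled = min_max_scale(age)    # -> [0.0, 0.5, 1.0]
```

Median imputation would simply swap `mean` for `statistics.median`; the choice between the two usually depends on how skewed the column is.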
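For the disambiguation step, one pass of a refinement loop might look like the following sketch. It assumes each sample carries a set of candidate labels with model-estimated confidences; the data layout and the `min_conf` cutoff are assumptions for illustration, not the specific [FS004] procedure.

```python
# Sketch of one label-refinement pass for partial label learning:
# keep each sample only if its best candidate label is high-confidence.
# Data shapes and threshold are illustrative assumptions.

def disambiguate(candidates, confidence, min_conf=0.5):
    """Pair each input with its single highest-confidence candidate label.

    candidates: {sample_id: [label, ...]}
    confidence: {(sample_id, label): score in [0, 1]}
    Returns {sample_id: label} for samples whose best candidate clears min_conf.
    """
    resolved = {}
    for sid, labels in candidates.items():
        best = max(labels, key=lambda lb: confidence.get((sid, lb), 0.0))
        if confidence.get((sid, best), 0.0) >= min_conf:
            resolved[sid] = best
    return resolved

cands = {"s1": ["cat", "dog"], "s2": ["dog", "fox"]}
conf = {("s1", "cat"): 0.9, ("s1", "dog"): 0.2,
        ("s2", "dog"): 0.3, ("s2", "fox"): 0.4}
labels = disambiguate(cands, conf)   # -> {"s1": "cat"}; s2 stays ambiguous
```

In an iterative scheme, the resolved pairs would be fed back to retrain the confidence model, and the loop repeats until the resolved set stabilizes.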
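The importance-normalization and thresholding steps can be sketched together. A simple sum-to-one rescaling stands in here for the normalization formula (Eq. 14 is not reproduced in this document), and all names are illustrative.

```python
# Sketch of FIM normalization + subset selection by thresholding.
# Sum-to-one rescaling is a stand-in for the unreproduced Eq. 14.

def normalize_importance(scores):
    """Rescale raw importance scores so they sum to 1 and are comparable."""
    total = sum(scores.values())
    return {f: s / total for f, s in scores.items()}

def select_subset(scores, threshold):
    """Rank features by importance; keep those exceeding the threshold."""
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return [f for f, s in ranked if s > threshold]

raw = {"x1": 4.0, "x2": 1.0, "x3": 3.0, "x4": 2.0}
norm = normalize_importance(raw)     # x1: 0.4, x2: 0.1, x3: 0.3, x4: 0.2
choices = select_subset(norm, 0.15)  # reduced "Choices" set -> ["x1", "x3", "x4"]
```

Selecting a fixed top-k instead of thresholding would replace the final filter with a slice of the ranked list (`ranked[:k]`).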