Functional Similarity Metric for Neural Networks: Overcoming Parametric Ambiguity via Activation Region Analysis
arXiv cs.LG / 4/21/2026
Key Points
- The paper addresses representational ambiguity in ReLU neural networks, where different weight configurations can implement the same function due to neuron-permutation and positive diagonal-scaling symmetries.
- It introduces a stable canonical form and a functional similarity metric that compares networks by analyzing activation-region topology rather than raw parameters.
- The method removes scaling ambiguity by L2-normalizing each neuron's incoming weight vector and compensating the scale in the next layer, then builds discrete activation-region “signatures” by evaluating binary on/off patterns over sampled data (see the sketches after this list).
- To compare large binary signatures efficiently, it uses Locality-Sensitive Hashing with MinHash to approximate the Jaccard index, and performs cross-network neuron matching with the Hungarian algorithm (also sketched below).
- The authors show theoretically and experimentally that the metric reduces neuron “flickering” and remains robust under small weight perturbations, supporting model merging, pruning assessment, transfer learning, and Explainable AI.
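To make the scaling step concrete, here is a minimal sketch for a single hidden ReLU layer. This is not the authors' implementation: it assumes the canonical scale is the L2 norm of each neuron's incoming weight row (whether the paper folds the bias into that norm is not stated here), and it relies only on the identity ReLU(a·z) = a·ReLU(z) for a > 0.

```python
import numpy as np

def canonicalize_layer(W1, b1, W2):
    """Remove positive diagonal-scaling symmetry from a ReLU layer.

    For any a > 0, ReLU(a*z) = a*ReLU(z), so dividing row i of (W1, b1)
    by a and multiplying column i of W2 by a leaves the network's
    function unchanged. Fixing a = ||row i of W1||_2 yields a canonical
    representative with unit-norm incoming weight vectors.
    """
    norms = np.linalg.norm(W1, axis=1, keepdims=True)  # (n_hidden, 1)
    norms = np.maximum(norms, 1e-12)                   # guard dead rows
    W1_c = W1 / norms
    b1_c = b1 / norms.ravel()
    W2_c = W2 * norms.ravel()[np.newaxis, :]           # compensate next layer
    return W1_c, b1_c, W2_c

# Quick check that the function is preserved on a random network.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 4)), rng.normal(size=8)
W2 = rng.normal(size=(3, 8))
x = rng.normal(size=4)
W1c, b1c, W2c = canonicalize_layer(W1, b1, W2)
y  = W2  @ np.maximum(W1  @ x + b1,  0)
yc = W2c @ np.maximum(W1c @ x + b1c, 0)
assert np.allclose(y, yc)  # same function, canonical parameters
```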
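The activation-region signatures can then be read off by thresholding pre-activations over a sample set. Again a sketch under assumptions: the paper's exact sampling scheme is not specified above, and this version records one binary vector per neuron (which samples activate it) rather than one pattern per input region.

```python
def activation_signatures(W, b, X):
    """Binary activation patterns for one ReLU layer over sampled inputs.

    X is (n_samples, n_in); returns an (n_neurons, n_samples) boolean
    matrix whose row i is neuron i's signature: which samples land on
    the active side of its hyperplane.
    """
    pre = X @ W.T + b   # (n_samples, n_neurons) pre-activations
    return (pre > 0).T  # one binary signature per neuron
```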
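Finally, treating each neuron's signature as the set of sample indices where it is active, MinHash sketches give an unbiased estimate of the Jaccard index between signatures, and SciPy's `linear_sum_assignment` (the Hungarian algorithm) matches neurons across two networks. This sketch omits the LSH banding that would avoid the all-pairs comparison, and `num_hashes` is an illustrative parameter, not one taken from the paper.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def minhash(signatures, num_hashes=64, seed=0):
    """MinHash sketches for boolean signatures (rows = neurons).

    Each hash applies a random permutation of the sample indices and
    records the first position at which the signature is True; two
    sketches agree on a given hash with probability equal to the
    Jaccard index of the underlying sets.
    """
    rng = np.random.default_rng(seed)
    n_neurons, n_samples = signatures.shape
    sketches = np.empty((n_neurons, num_hashes), dtype=np.int64)
    for h in range(num_hashes):
        permuted = signatures[:, rng.permutation(n_samples)]
        # first active position under this permutation
        # (n_samples acts as a sentinel for never-active neurons)
        sketches[:, h] = np.where(permuted.any(axis=1),
                                  permuted.argmax(axis=1), n_samples)
    return sketches

def match_neurons(sk_a, sk_b):
    """Match neurons across two networks via Hungarian assignment.

    Estimated Jaccard similarity = fraction of agreeing hash slots;
    linear_sum_assignment minimizes total (1 - similarity) cost.
    """
    sim = (sk_a[:, None, :] == sk_b[None, :, :]).mean(axis=2)
    rows, cols = linear_sum_assignment(1.0 - sim)
    return list(zip(rows, cols)), sim[rows, cols].mean()
```

Per the key points, these comparisons run on the canonicalized networks, so matched neurons reflect activation-region geometry rather than raw parameter values.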