Fixing Unsupervised Hyperbolic Contrastive Loss [D]

Reddit r/MachineLearning / 5/5/2026


Key Points

  • The post describes an attempt to implement an “Unsupervised Hyperbolic Contrastive Loss” for ImageNet-1k, but the author observes that a simpler Euclidean unsupervised contrastive loss performs substantially better.
  • The author shares a PyTorch-style loss implementation that computes pairwise Lorentzian (hyperbolic) distances on a manifold using `dist`, then forms logits as `-distance / temperature` and applies cross-entropy with one-to-one (index-based) labels.
  • Reported 1-NN accuracy is lower for hyperbolic embeddings (57%) than for a cosine baseline (64%) under the stated batch size (2048) and learning rate (1e-4).
  • The author asks for help understanding what is wrong with the hyperbolic loss setup, including correct use of exponential map (`expmap`) and projection (`projx`) to keep embeddings on the Lorentzian manifold.
  • The key takeaway is that correctly formulating hyperbolic contrastive learning objectives and their numerical/manifold details is non-trivial, and small implementation choices can degrade performance versus Euclidean or cosine alternatives.
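For reference, the cosine baseline mentioned above is the standard InfoNCE objective: cosine similarities scaled by a temperature, with matching indices as positives. The following is a minimal pure-Python sketch of that baseline for illustration (the function name `cosine_infonce` and the use of plain lists are assumptions, not the author's actual training code):

```python
import math

def cosine_infonce(z, z1, temp=0.07):
    """Cosine-similarity InfoNCE loss; z[i] and z1[i] are a positive pair.

    Illustrative sketch only -- real training code would use batched
    tensor ops (e.g. PyTorch) rather than Python loops.
    """
    def cos(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb)

    n = len(z)
    loss = 0.0
    for i in range(n):
        # Logits: similarity of anchor i to every candidate, scaled by temp
        logits = [cos(z[i], z1[j]) / temp for j in range(n)]
        # Numerically stable log-sum-exp for the softmax normalizer
        m = max(logits)
        log_norm = m + math.log(sum(math.exp(l - m) for l in logits))
        # Cross-entropy with label i (the matching index)
        loss += log_norm - logits[i]
    return loss / n
```

When all embeddings are identical, every logit is equal and the loss reduces to log(batch size), which is a handy sanity check.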

Hello all,

I am trying to implement an Unsupervised Hyperbolic Contrastive Loss on the ImageNet-1k dataset. My results show that a simple Euclidean unsupervised contrastive loss performs much better than the hyperbolic version. Please help me understand the problem. I am using expmap() and projx() to ensure the embeddings are on the Lorentzian manifold. Below is my code:

import torch
import torch.nn.functional as F

def hb_contrastive_loss(z, z1, model, temp=0.07):
    # Pairwise geodesic distances on the Lorentzian manifold: (B, B)
    z_to_neighbor = model.manifold.dist(z.unsqueeze(1), z1.unsqueeze(0))
    # Positive pairs sit on the diagonal: z[i] matches z1[i]
    labels = torch.arange(z.size(0), device=z.device)
    # Smaller distance -> larger logit
    logits = -z_to_neighbor / temp
    loss = F.cross_entropy(logits, labels)
    return loss
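To make the manifold operations concrete, here is a minimal pure-Python sketch of the hyperboloid (Lorentz) model at curvature -1: the Minkowski inner product, the exponential map at the origin, and the geodesic distance the loss above is built on. The function names (`minkowski_dot`, `expmap0`, `lorentz_dist`) are illustrative, not the actual API of the author's manifold library:

```python
import math

# Hyperboloid model sketch (curvature K = -1). Points x satisfy
# <x, x>_L = -1 with x[0] > 0; the origin is o = (1, 0, ..., 0).

def minkowski_dot(x, y):
    # Lorentzian inner product: -x0*y0 + sum_i xi*yi
    return -x[0] * y[0] + sum(a * b for a, b in zip(x[1:], y[1:]))

def expmap0(v):
    # Exponential map at the origin; v is a tangent vector with v[0] == 0.
    norm = math.sqrt(sum(a * a for a in v[1:]))
    if norm == 0.0:
        return [1.0] + [0.0] * (len(v) - 1)
    return [math.cosh(norm)] + [math.sinh(norm) * a / norm for a in v[1:]]

def lorentz_dist(x, y):
    # Geodesic distance: arccosh(-<x, y>_L), clamped for numerical safety,
    # since floating-point error can push -<x, y>_L slightly below 1.
    inner = max(-minkowski_dot(x, y), 1.0)
    return math.acosh(inner)
```

The clamp in `lorentz_dist` matters in practice: without it, `acosh` of a value fractionally below 1 produces NaNs, which is one of the common numerical pitfalls in hyperbolic contrastive training.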

Current results for 1-NN accuracy:

Hyperbolic = 57%
Cosine = 64%

More information (if relevant):
Batch size = 2048
LR = 1e-4

submitted by /u/arjun_r_kaushik