HGNet: Scalable Foundation Model for Automated Knowledge Graph Generation from Scientific Literature
arXiv:2603.23136v1 Announce Type: new

Abstract: Automated knowledge graph (KG) construction is essential for navigating the rapidly expanding body of scientific literature. However, existing approaches struggle to recognize long multi-word entities, often fail to generalize across domains, and typically overlook the hierarchical nature of scientific knowledge. While general-purpose large language models (LLMs) offer adaptability, they are computationally expensive and yield inconsistent accuracy on specialized tasks. As a result, current KGs are shallow and inconsistent, limiting their utility for exploration and synthesis. We propose a two-stage framework for scalable, zero-shot scientific KG construction. The first stage, Z-NERD, introduces (i) Orthogonal Semantic Decomposition (OSD), which promotes domain-agnostic entity recognition by isolating semantic "turns" in text, and (ii) a Multi-Scale TCQK attention mechanism that captures coherent multi-word entities through n-gram-aware attention heads. The second stage, HGNet, performs relation extraction with hierarchy-aware message passing, explicitly modeling parent, child, and peer relations. To enforce global consistency, we introduce two complementary objectives: a Differentiable Hierarchy Loss to discourage cycles and shortcut edges, and a Continuum Abstraction Field (CAF) Loss that embeds abstraction levels along a learnable axis in Euclidean space. This is the first approach to formalize hierarchical abstraction as a continuous property within standard Euclidean embeddings, offering a simpler alternative to hyperbolic methods. We release SPHERE (https://github.com/basiralab/SPHERE), a multi-domain benchmark for hierarchical relation extraction. Our framework establishes a new state of the art on SciERC, SciER, and SPHERE, improving NER by 8.08% and RE by 5.99% on out-of-distribution tests. In zero-shot settings, gains reach 10.76% for NER and 26.2% for RE.
Executive Summary
The paper introduces a two-stage framework for scalable, zero-shot knowledge graph (KG) generation from scientific literature, addressing critical gaps in existing approaches. The first stage, Z-NERD, combines Orthogonal Semantic Decomposition (OSD) with a Multi-Scale TCQK attention mechanism to better identify long multi-word entities and domain-agnostic patterns. The second stage, HGNet, employs hierarchy-aware message passing and two complementary consistency objectives, a Differentiable Hierarchy Loss and a Continuum Abstraction Field (CAF) Loss, to enforce global consistency and model hierarchical abstraction as a continuous property in standard Euclidean embeddings. These innovations yield measurable improvements in NER (8.08%) and RE (5.99%) on out-of-distribution tests, and up to 26.2% in RE under zero-shot conditions, establishing a new state of the art. The release of the SPHERE benchmark further supports reproducibility and comparability.
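The abstract describes the CAF Loss only as embedding abstraction levels along a learnable axis in Euclidean space; the exact objective is not given. A minimal sketch, assuming entity embeddings are projected onto a unit-normalized learnable axis and regressed against target abstraction levels, might look like:

```python
import math

def caf_loss(embeddings, axis, levels):
    """Hypothetical sketch of a Continuum Abstraction Field loss.

    Projects each entity embedding onto a learnable abstraction axis and
    penalizes squared deviation from its target abstraction level.
    (Assumed form; the paper's exact objective is not in the abstract.)
    """
    norm = math.sqrt(sum(a * a for a in axis))
    unit = [a / norm for a in axis]  # normalize the learnable axis
    total = 0.0
    for emb, level in zip(embeddings, levels):
        # scalar position of this entity along the abstraction axis
        proj = sum(e * u for e, u in zip(emb, unit))
        total += (proj - level) ** 2
    return total / len(embeddings)

# Entities whose axis projections already match their abstraction levels
embs = [[0.0, 3.0], [1.0, -2.0], [2.0, 0.5]]
print(caf_loss(embs, [2.0, 0.0], [0.0, 1.0, 2.0]))  # → 0.0
```

Such a formulation keeps everything in flat Euclidean space, which is consistent with the paper's stated goal of avoiding hyperbolic embeddings.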
Key Points
- ▸ Introduction of OSD and TCQK attention for improved entity recognition
- ▸ Hierarchy-aware message passing in HGNet for structured KG construction
- ▸ Use of complementary losses to enforce consistency and abstraction modeling
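The hierarchy-aware message passing in the second key point distinguishes parent, child, and peer relations. A minimal relation-typed aggregation step, assuming scalar per-relation gates (`weights`, a hypothetical stand-in for the learned per-relation transforms), could be sketched as:

```python
def hierarchy_message_pass(features, edges, weights):
    """One hypothetical hierarchy-aware message-passing step.

    features: {node: [float, ...]} node embeddings
    edges:    {"parent" | "child" | "peer": [(src, dst), ...]}
    weights:  {"self" | "parent" | "child" | "peer": float} scalar gates
              (stand-ins for learned per-relation weight matrices)
    """
    dim = len(next(iter(features.values())))
    # start from a gated copy of each node's own features
    out = {v: [weights["self"] * x for x in h] for v, h in features.items()}
    for rel in ("parent", "child", "peer"):
        # collect incoming messages per destination node, per relation type
        inbox = {}
        for src, dst in edges.get(rel, []):
            inbox.setdefault(dst, []).append(features[src])
        for dst, msgs in inbox.items():
            for d in range(dim):
                mean = sum(m[d] for m in msgs) / len(msgs)
                out[dst][d] += weights[rel] * mean
    return out

feats = {"paper": [1.0, 0.0], "method": [0.0, 1.0]}
edges = {"parent": [("paper", "method")]}  # "paper" is parent of "method"
w = {"self": 1.0, "parent": 0.5, "child": 0.5, "peer": 0.25}
print(hierarchy_message_pass(feats, edges, w))
# "method" additionally receives 0.5 * [1.0, 0.0] from its parent
```

Keeping separate aggregation paths per relation type is what lets the model treat "is-a" edges differently from lateral peer edges, as the abstract's parent/child/peer distinction implies.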
Merits
Innovation
HGNet introduces novel mechanisms (OSD, TCQK, CAF Loss) that address prior limitations in entity recognition and hierarchical modeling, offering a scalable, zero-shot solution.
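The TCQK mechanism is described only as using n-gram-aware attention heads to capture multi-word entities. One plausible reading, sketched here under that assumption, is that keys are mean-pooled over sliding n-gram windows so a head can assign a single attention weight to a whole span:

```python
import math

def ngram_keys(keys, n):
    """Pool per-token key vectors over sliding n-gram windows (mean pooling)."""
    pooled = []
    for i in range(len(keys) - n + 1):
        window = keys[i:i + n]
        pooled.append([sum(col) / n for col in zip(*window)])
    return pooled

def attend(query, keys):
    """Scaled dot-product attention weights for a single query vector."""
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    m = max(scores)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

# A head with n=2 scores bigram spans rather than single tokens, so a
# coherent two-word span competes for one attention weight as a unit.
toks = [[1.0, 0.0], [1.0, 0.0], [0.0, 1.0]]
print(attend([1.0, 0.0], ngram_keys(toks, 2)))
```

Running heads at several window sizes in parallel would give the "multi-scale" behavior the name suggests; the actual TCQK parameterization may differ.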
Demerits
Computational Complexity
While designed for scalability, the multi-stage architecture, with its multiple attention mechanisms and loss terms, may introduce computational overhead that limits real-time applicability without further optimization.
Expert Commentary
HGNet represents a significant advancement in automated KG generation by bridging the gap between general-purpose LLMs and specialized scientific KG systems. OSD's domain-agnostic semantic decomposition is particularly compelling, as it enables more robust entity identification without prior domain adaptation. Furthermore, the CAF Loss's formalization of abstraction as a continuous property within Euclidean space marks a conceptual leap, in that it avoids the computational and interpretability challenges often associated with hyperbolic embeddings. The benchmark release, SPHERE, is a commendable contribution to the field, offering a standardized evaluation framework that will likely become a reference point for future work. While computationally intensive, the proposed architecture is well justified given the gains in accuracy and consistency. Overall, HGNet is a foundational contribution that raises the standard for scalable, hierarchical KG construction in scientific domains.
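The Differentiable Hierarchy Loss is said to discourage cycles and shortcut edges, but its form is not given in the abstract. A common way to make acyclicity differentiable, shown here purely as an illustrative sketch, is to penalize the diagonal mass of powers of a soft (probabilistic) adjacency matrix, since any probability returning to a node's own diagonal signals a soft cycle:

```python
def matmul(A, B):
    """Plain dense matrix multiply for small square matrices."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def cycle_penalty(adj, max_len=3):
    """Hypothetical soft-cycle penalty: sum of traces of adj^k, k = 2..max_len.

    adj holds predicted edge probabilities; an acyclic graph scores 0,
    and the penalty grows with the probability mass on any cycle.
    """
    n = len(adj)
    power, total = adj, 0.0
    for _ in range(2, max_len + 1):
        power = matmul(power, adj)
        total += sum(power[i][i] for i in range(n))
    return total

acyclic = [[0.0, 0.9, 0.0], [0.0, 0.0, 0.8], [0.0, 0.0, 0.0]]
cyclic  = [[0.0, 0.9, 0.0], [0.0, 0.0, 0.8], [0.7, 0.0, 0.0]]
print(cycle_penalty(acyclic))  # → 0.0
print(cycle_penalty(cyclic))   # positive: the 3-cycle contributes 3 * 0.504
```

Whether HGNet uses a trace-based penalty, an exponential variant, or something else entirely is not stated; the sketch only shows why such a term is differentiable end to end.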
Recommendations
- ✓ 1. Academic institutions should adopt SPHERE as a benchmark for evaluating hierarchical KG systems to ensure comparability and rigor.
- ✓ 2. Researchers should explore lightweight variants of the HGNet architecture for resource-constrained environments, potentially leveraging distillation or parameter-efficient fine-tuning to maintain performance gains.
Sources
Original: arXiv - cs.CL