Abstract: With the increasing availability of computational and data resources, numerous powerful pre-trained language models (PLMs) have emerged for natural language processing tasks. However, how to ...
UniPre3D is the first unified pre-training method for 3D point clouds that effectively handles both object- and scene-level data through cross-modal Gaussian splatting. Our proposed pre-training task ...
Abstract: Web attacks are a major threat to cyberspace security, so web attack detection has become a critical task. Traditional supervised learning methods learn features of web attacks with ...
Quantization plays a crucial role in deploying Large Language Models (LLMs) in resource-constrained environments. However, the presence of outlier features significantly hinders low-bit quantization.