PaperHub

Han Liu
~Han_Liu4

Total papers: 40
Avg. submissions per year: 20.0
Average review score: 5.3
Accepted: 19/40

Venue distribution: ICLR 27, NeurIPS 9, ICML 4

Published papers (40)

2025 (32 papers)

- Explainable Sequential Optimization. ICLR 2025, withdrawn. Avg. score 4.0 (5 reviews).
- A Comprehensive Framework for Benchmarking Algorithms Across Hyperparameter Spaces. ICLR 2025, withdrawn. Avg. score 1.5 (4 reviews).
- RDAS: A Low Latency and High Throughput Raw Data Engine for Machine Learning Systems. ICLR 2025, desk rejected. No score.
- MambaFormer-MOE: Mamba-Transformer-based Mixture-of-Experts for Time Series Prediction. ICLR 2025, desk rejected. No score.
- MetaFind: Scene-Aware 3D Asset Retrieval for Coherent Metaverse Scene Generation. NeurIPS 2025, Poster. Avg. score 6.8 (4 reviews).
- On Differentially Private String Distances. ICLR 2025, withdrawn. Avg. score 4.3 (4 reviews).
- HeteGraph-Mamba: Heterogeneous Graph Learning via Selective State Space Model. ICLR 2025, withdrawn. Avg. score 3.5 (4 reviews).
- SWGA: A Distributed Hyperparameter Search Method for Time Series Prediction Models. ICLR 2025, rejected. Avg. score 2.0 (4 reviews).
- Chain-of-Action: Faithful and Multimodal Question Answering through Large Language Models. ICLR 2025, Poster. Avg. score 7.0 (3 reviews).
- Attention Mechanism, Max-Affine Partition, and Universal Approximation. NeurIPS 2025, Poster. Avg. score 6.4 (4 reviews).
- Investigating Hallucinations of Time Series Foundation Models through Signal Subspace Analysis. NeurIPS 2025, Poster. Avg. score 6.4 (4 reviews).
- Automated Feature Engineering by Prompting. ICLR 2025, rejected. Avg. score 5.3 (4 reviews).
- AlignAb: Pareto-Optimal Energy Alignment for Designing Nature-Like Antibodies. ICLR 2025, rejected. Avg. score 4.5 (4 reviews).
- Computational Limits of Low-Rank Adaptation (LoRA) Fine-Tuning for Transformer Models. ICLR 2025, Poster. Avg. score 5.8 (4 reviews).
- Pareto-Optimal Energy Alignment for Designing Nature-Like Antibodies. NeurIPS 2025, Poster. Avg. score 7.3 (4 reviews).
- BOOST: Enhanced Jailbreak of Large Language Model via Slient eos Tokens. ICLR 2025, rejected. Avg. score 5.5 (4 reviews).
- Nonparametric Modern Hopfield Models. ICML 2025, Poster. Avg. score 6.1 (4 reviews).
- In-Context Deep Learning via Transformer Models. ICML 2025, Poster. Avg. score 6.1 (4 reviews).
- In-Context Learning as Conditioned Associative Memory Retrieval. ICML 2025, Poster. Avg. score 6.1 (4 reviews).
- Pretrained Transformers are Deep Optimizers: Provable In-Context Learning for Deep Model Training. ICLR 2025, rejected. Avg. score 5.3 (4 reviews).
- Fundamental Limits of Prompt Tuning Transformers: Universality, Capacity and Efficiency. ICLR 2025, Poster. Avg. score 6.3 (4 reviews).
- Can Transformers Perform PCA? ICLR 2025, withdrawn. Avg. score 4.2 (5 reviews).
- Transformers Learn Bayesian Networks Autoregressively In-Context. ICLR 2025, withdrawn. Avg. score 3.5 (4 reviews).
- On Statistical Rates of Conditional Diffusion Transformers: Approximation, Estimation and Minimax Optimality. ICLR 2025, Poster. Avg. score 6.3 (4 reviews).
- GenomeOcean: Efficient Foundation Model for Genome Generation. ICLR 2025, rejected. Avg. score 3.5 (4 reviews).
- Decoupled Alignment for Robust Plug-and-Play Adaptation. ICLR 2025, withdrawn. Avg. score 4.0 (4 reviews).
- Codev-Bench: How Do LLMs Understand Developer-Centric Code Completion? ICLR 2025, rejected. Avg. score 4.3 (4 reviews).
- Fast and Low-Cost Genomic Foundation Models via Outlier Removal. ICML 2025, Poster. Avg. score 6.6 (4 reviews).
- DNABERT-S: Pioneering Species Differentiation with Species-Aware DNA Embeddings. ICLR 2025, rejected. Avg. score 5.7 (3 reviews).
- High-Order Flow Matching: Unified Framework and Sharp Statistical Rates. NeurIPS 2025, Poster. Avg. score 6.4 (5 reviews).
- SPARQ: Outlier-free SpeechLM with Fast Adaptation and Robust Quantization. ICLR 2025, rejected. Avg. score 5.5 (4 reviews).
- A Benchmark Study For Limit Order Book (LOB) Models and Time Series Forecasting Models on LOB Data. ICLR 2025, rejected. Avg. score 4.2 (5 reviews).

2024 (8 papers)