Stars
Data, benchmarks, and methods submitted to the M4 forecasting competition
This repository contains the source code and dataset link mentioned in DAC 2023 accepted paper "Faster and Stronger Lossless Compression with Optimized Autoregressive Framework".
A Library for Advanced Deep Time Series Models.
ForecastGrapher: Redefining Multivariate Time Series Forecasting with Graph Neural Networks
Dzip: improved general-purpose lossless compression based on novel neural network modeling
This repository contains the source code and dataset link mentioned in WWW 2022 accepted paper "TRACE: A Fast Transformer-based General-Purpose Lossless Compressor".
The goal of the library is to help with research in the area of data compression. It is not meant to be a fast or efficient implementation, but rather serves educational purposes.
A Python package providing buffer compression and transformation codecs for use in data storage and communication applications.
A guide for prompt engineers, based on the English original, with an added section on AIGC prompts; translated and updated to lower the learning barrier for students.
Recurrent Event Network: Autoregressive Structure Inference over Temporal Knowledge Graphs (EMNLP 2020)
[NeurIPS 2022] The official PyTorch implementation of "Neural Temporal Walks: Motif-Aware Representation Learning on Continuous-Time Dynamic Graphs"
[Paper][ACL 2024 Findings] Knowledgeable Preference Alignment for LLMs in Domain-specific Question Answering
[WWWJ 2024] LLMs for Knowledge Graph Construction and Reasoning: Recent Capabilities and Future Opportunities
The online version is temporarily unavailable because we cannot afford the key. You can clone and run it locally. Note: we set a default OpenAI key. If the keys exceed the plan and become invalid, please tell us…
LlamaIndex is the leading framework for building LLM-powered agents over your data.
Package for causal inference in graphs and in the pairwise settings. Tools for graph structure recovery and dependencies are included.
The code and data for "StructGPT: A general framework for Large Language Model to Reason on Structured Data"
Firefly: a training tool for large language models, supporting training of Qwen2.5, Qwen2, Yi1.5, Phi-3, Llama3, Gemma, MiniCPM, Yi, Deepseek, Orion, Xverse, Mixtral-8x7B, Zephyr, Mistral, Baichuan2, Llama2, Llama, Qwen, Baichuan, ChatGLM2, InternLM, Ziya2, Vicuna, Bloom, and other large models
Tuning LLMs with no tears💦; Sample Design Engineering (SDE) for more efficient downstream tuning.
Unified Efficient Fine-Tuning of 100+ LLMs & VLMs (ACL 2024)
Reimplementation of AAAI21 paper "Beyond Low-frequency Information in Graph Convolutional Networks" based on PyTorch and PyTorch Geometric (PyG).
Boost learning for GNNs from the graph structure under challenging heterophily settings. (NeurIPS'20)
ChatGLM2-6B: An Open Bilingual Chat LLM | Open-Source Bilingual Dialogue Language Model
2018 CCF Big Data & Computational Intelligence Contest: intelligent personalized plan matching model for existing telecom users (China Unicom track), second place in the finals. [Multi-class classification, embedding]