dataset

A complete QA pair contains three parts. input: the passage of text on which the question is based. question: the question itself. answer: the answer to that question. (Either Chinese or English is acceptable.)

A concrete example follows (note: each QA pair occupies a single line):

{"input": "The paper proposes using gradient descent to learn hyperparameters for machine learning algorithms, such as the learning rate, and to recursively learn hyper-hyperparameters as well. This approach reduces the burden on users to search for optimal values and can be applied to various optimizers, including Adam. The paper presents a method for computing the step-size update rule automatically using reverse-mode automatic differentiation. The approach is shown to be scalable and less sensitive to the choice of top-level hyperparameters.", "question": "这篇论文创新点是什么", "": "The innovation of this paper is the development of a method for automatically optimizing hyperparameters in gradient-based optimization algorithms such as SGD, AdaGrad, AdaDelta, and Adam. The authors demonstrate the effectiveness of their approach by optimizing all four hyperparameters of the Adam optimizer, which was previously not possible, and by recursively stacking hyperoptimizers to create higher-order hyperoptimizers that are more robust to the choice of top-level hyperparameters. The paper also discusses the challenges of computing the partial derivatives of the step-size update rule with respect to the hyperparameters and proposes a method for computing these derivatives automatically using backwards automatic differentiation."}

.....
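Since each record is one JSON object per line (JSONL), a minimal sketch of writing and validating pairs in this format is shown below. The file name qa_pairs.jsonl and the helper code are illustrative assumptions, not part of this repository.

import json

# Illustrative sketch: build one QA pair and append it as a single JSON line.
# Field names follow the description above; "qa_pairs.jsonl" is an assumed path.
record = {
    "input": "The paper proposes using gradient descent to learn hyperparameters ...",
    "question": "这篇论文创新点是什么",
    "answer": "The innovation of this paper is ...",
}

# Write without pretty-printing so the whole record stays on one line;
# ensure_ascii=False keeps Chinese text readable in the file.
with open("qa_pairs.jsonl", "a", encoding="utf-8") as f:
    f.write(json.dumps(record, ensure_ascii=False) + "\n")

# Read back and check that every line parses and carries the three expected fields.
with open("qa_pairs.jsonl", encoding="utf-8") as f:
    for line in f:
        qa = json.loads(line)
        assert {"input", "question", "answer"} <= qa.keys()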
