
For a personal project, I am interested in benchmarking certain neural network architectures in the context of high-dimensional function approximation. Specifically, I am interested in continuous, smooth, Hölder, and Sobolev functions defined on the unit cube $[0,1]^d \subset \mathbb{R}^d$.

  • Does anyone know if a standard list of high-dimensional functions is commonly used in the literature to benchmark such models? For example, in the optimization literature, there is a standard list of functions, such as the ones found here, to benchmark various optimization algorithms.

  • If no such list is available, how should one construct a representative list of functions? This choice will introduce an inductive bias in the problem, so I'd like to ensure the list is as representative as possible.
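In case it helps frame the second bullet: one way to build such a list without hand-picking individual functions is to *sample* random functions whose smoothness you control, e.g. a truncated random cosine series whose coefficients decay at a rate tied to the desired smoothness $s$. The sketch below is only a heuristic illustration of that idea, not an established benchmark; the function name, the cutoff `K`, and the decay exponent are all illustrative choices, and `numpy` is assumed.

```python
import numpy as np

def sample_sobolev_like(d, s, n_terms=200, seed=0):
    """Draw a random function on [0,1]^d from a truncated random cosine
    series whose coefficients decay like |k|^{-(s + d/2 + eps)} -- a
    heuristic proxy for Sobolev smoothness s. All names and parameters
    here are illustrative, not a standard API.
    """
    rng = np.random.default_rng(seed)
    K = 5  # per-coordinate frequency cutoff (illustrative)
    # Random multi-indices k in {0, ..., K}^d.
    ks = rng.integers(0, K + 1, size=(n_terms, d))
    # Coefficient decay controls smoothness; |k| floored at 1 to avoid 0^-p.
    norms = np.maximum(np.linalg.norm(ks, axis=1), 1.0)
    coeffs = rng.standard_normal(n_terms) * norms ** (-(s + d / 2 + 0.01))

    def f(x):
        # x: array of shape (n_points, d) with entries in [0,1].
        x = np.atleast_2d(x)
        phases = np.pi * x @ ks.T       # shape (n_points, n_terms)
        return np.cos(phases) @ coeffs  # random cosine-feature expansion
    return f

f = sample_sobolev_like(d=10, s=2.0)
X = np.random.default_rng(1).random((4, 10))
print(f(X).shape)  # (4,)
```

Drawing many functions this way (varying the seed and $s$) gives an ensemble rather than a fixed list, which may reduce the inductive bias of any single hand-chosen test function.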

Thanks!

user82261

1 Answer


Not an answer, but I don't have the reputation to comment. I might be uninformed, but how does one find functions representative of the various function spaces you mentioned? Those spaces are defined by restriction properties, so how does one characterize or sample from them? I'm curious; I hadn't heard of doing that before. Perhaps expanding in a basis that spans the space would be one way, or using a metric on the space if one exists? Sorry I can't be of more help, but I'm curious about the background behind your question. I can delete this if it's not appropriate.

dbdb