===== Hyperparameter optimization of deep neural networks using mesh adaptive direct search =====
{{:groupe-dfo-bbo:acteurs:students:doc:dounia-lakhmiri:hypernomad_acc-1.png?nolink&400 |}}

**Abstract:** The performance of deep neural networks is highly sensitive to the choice of the hyperparameters that define the structure of the network and the learning process. When facing a new application, tuning a deep neural network is a tedious and time-consuming process that is often described as a "dark art", hence the need to automate the calibration of these hyperparameters. Derivative-free optimization is a field that develops methods designed to optimize time-consuming functions without relying on derivatives.
This work introduces the HyperNOMAD package, an extension of the NOMAD software that applies the MADS algorithm to simultaneously tune the hyperparameters responsible for both the architecture and the learning process of a deep neural network (DNN), and that allows significant flexibility in the exploration of the search space by taking advantage of categorical variables.
This new approach is tested on the MNIST and CIFAR-10 data sets and achieves results comparable to the current state of the art.

{{:groupe-dfo-bbo:acteurs:students:doc:dounia-lakhmiri:comparison_mnist_default-1.png?nolink&300| }} {{ :groupe-dfo-bbo:acteurs:students:doc:dounia-lakhmiri:comparison_cifar10_default_2-1.png?nolink&300|}}
The manuscript can be found here: [[https://arxiv.org/abs/1907.01698|HyperNOMAD_paper]].
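
To make the setting concrete, here is a minimal, self-contained Python sketch of the kind of derivative-free loop involved: a blackbox that stands in for "train the network and return the validation error", polled by a toy direct search with an extended poll over a categorical variable. The function names, the synthetic objective, and the search loop are illustrative assumptions, not the HyperNOMAD or NOMAD API; see the manuscript above for the actual method.

<code python>
# Illustrative sketch only (not the HyperNOMAD/NOMAD API): a toy
# direct-search loop over mixed hyperparameters, mimicking the kind
# of blackbox that MADS-based tuners poll.

OPTIMIZERS = ["sgd", "adam", "rmsprop"]  # categorical variable


def blackbox(n_layers, log_lr, optimizer):
    """Stand-in for 'train the DNN, return the validation error'.

    In the real setting this call is expensive and derivative-free:
    the solver only ever sees the returned scalar.
    """
    # Synthetic landscape with its best point at 4 layers, lr=1e-3, 'adam'.
    penalty = {"sgd": 0.05, "adam": 0.0, "rmsprop": 0.02}[optimizer]
    return 0.01 * (n_layers - 4) ** 2 + 0.1 * (log_lr + 3.0) ** 2 + penalty


def poll(x, mesh):
    """Generate poll points around x on the current mesh."""
    n_layers, log_lr, opt = x
    candidates = [
        (max(1, n_layers + 1), log_lr, opt),  # integer variable: +/- 1
        (max(1, n_layers - 1), log_lr, opt),
        (n_layers, log_lr + mesh, opt),       # continuous variable
        (n_layers, log_lr - mesh, opt),
    ]
    # Extended poll: also try the other categorical values.
    candidates += [(n_layers, log_lr, o) for o in OPTIMIZERS if o != opt]
    return candidates


def direct_search(x0, mesh=1.0, budget=100):
    best, f_best, evals = x0, blackbox(*x0), 1
    while evals < budget and mesh > 1e-4:
        improved = False
        for cand in poll(best, mesh):
            f = blackbox(*cand)
            evals += 1
            if f < f_best:
                best, f_best, improved = cand, f, True
                break  # opportunistic: accept the first improvement
        # Coarsen the mesh on success, refine it on failure (MADS-like).
        mesh = mesh * 2.0 if improved else mesh / 2.0
    return best, f_best


if __name__ == "__main__":
    x_best, f = direct_search((2, -1.0, "sgd"))
    print(f"best hyperparameters: {x_best}, validation error ~ {f:.4f}")
</code>

The point mirrored from the abstract is that the solver only ever consumes the returned scalar, so no gradients of the training process are needed, and categorical choices such as the optimizer are explored by polling the other category values.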
===== Tuning a variational autoencoder for data accountability problem in the Mars Science Laboratory ground data system =====
{{:groupe-dfo-bbo:acteurs:students:doc:dounia-lakhmiri:vae_arch_1_.png?nolink&400 |}}

**Abstract:** The Mars Curiosity rover frequently sends back telemetry data that passes through a pipeline of systems before reaching its final destination at the Mars Science Laboratory, which makes it prone to volume loss and data corruption. A ground data system analysis (GDSA) team is charged with monitoring this flow of information and detecting anomalous data in order to request a re-transmission when necessary. This work presents a derivative-free optimization method for tuning the architecture and hyperparameters of a variational autoencoder trained to detect data with missing patches, in order to assist the GDSA team in its mission.

{{:groupe-dfo-bbo:acteurs:students:doc:dounia-lakhmiri:benchmark_1_.png?nolink&300|}} {{ :groupe-dfo-bbo:acteurs:students:doc:dounia-lakhmiri:benchmark_bad_init.png?nolink&300|}}
The manuscript can be found here: [[https://arxiv.org/abs/2006.03962|Tuning VAE]].
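
As a companion to the abstract, here is a minimal, self-contained PyTorch sketch of the underlying technique: a variational autoencoder trained on nominal data, with samples flagged as anomalous when their reconstruction error rises far above the nominal distribution. The layer sizes, the synthetic data, and the threshold rule are illustrative assumptions, not the tuned configuration from the paper.

<code python>
# Illustrative sketch only: a small fully-connected VAE flagging
# anomalous telemetry windows by reconstruction error. Architecture
# sizes, data, and threshold are assumptions, not the paper's setup.
import torch
import torch.nn as nn
import torch.nn.functional as F


class VAE(nn.Module):
    def __init__(self, in_dim=64, hidden=32, latent=8):
        super().__init__()
        self.enc = nn.Linear(in_dim, hidden)
        self.mu = nn.Linear(hidden, latent)
        self.logvar = nn.Linear(hidden, latent)
        self.dec1 = nn.Linear(latent, hidden)
        self.dec2 = nn.Linear(hidden, in_dim)

    def encode(self, x):
        h = F.relu(self.enc(x))
        return self.mu(h), self.logvar(h)

    def decode(self, z):
        return self.dec2(F.relu(self.dec1(z)))

    def forward(self, x):
        mu, logvar = self.encode(x)
        std = torch.exp(0.5 * logvar)
        z = mu + std * torch.randn_like(std)  # reparameterization trick
        return self.decode(z), mu, logvar


def loss_fn(recon, x, mu, logvar):
    # Reconstruction term + KL divergence to the unit Gaussian prior.
    rec = F.mse_loss(recon, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return rec + kl


if __name__ == "__main__":
    torch.manual_seed(0)
    model = VAE()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)

    # Stand-in for nominal telemetry windows.
    data = torch.randn(512, 64) * 0.1

    for _ in range(200):  # short training loop
        opt.zero_grad()
        recon, mu, logvar = model(data)
        loss = loss_fn(recon, data, mu, logvar)
        loss.backward()
        opt.step()

    # Flag samples whose reconstruction error is far above nominal,
    # e.g. windows with missing patches.
    with torch.no_grad():
        recon, _, _ = model(data)
        err = ((recon - data) ** 2).mean(dim=1)
        threshold = err.mean() + 3 * err.std()
        corrupted = data.clone()
        corrupted[:, 16:32] = 0.0  # simulate a missing patch
        err_bad = ((model(corrupted)[0] - corrupted) ** 2).mean(dim=1)
        print("fraction flagged:", (err_bad > threshold).float().mean().item())
</code>

In the paper's setting, the architecture and training hyperparameters of such a model are themselves tuned by a derivative-free solver, with the detection quality playing the role of the blackbox objective.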
===== Meetings =====
  * [[groupe-dfo-bbo:acteurs:students:doc:dounia-lakhmiri:reunion-2020-05-11|Meeting of 11.05.2020]]
  * [[groupe-dfo-bbo:acteurs:students:doc:dounia-lakhmiri:reunion-2020-06-16|Meeting of 16.06.2020]]
  * [[groupe-dfo-bbo:acteurs:students:doc:dounia-lakhmiri:reunion-2020-07-06|Meeting of 06.07.2020]]
  * [[groupe-dfo-bbo:acteurs:students:doc:dounia-lakhmiri:reunion-2021-03-22|Meeting of 22.03.2021]]