Deep attractors: Where deep learning meets chaos
In nonlinear dynamics, when the state space is assumed to be multidimensional but all we have to work with is a univariate time series of measurements, one may attempt to reconstruct the...continue reading.
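The classical tool for this kind of reconstruction is delay-coordinate embedding. As a rough, self-contained illustration (the embedding dimension m and delay tau below are arbitrary choices, not values from the post):

    import numpy as np

    def delay_embed(x, m=3, tau=5):
        # build the matrix of delay vectors [x(t), x(t + tau), ..., x(t + (m - 1) * tau)]
        n = len(x) - (m - 1) * tau
        return np.column_stack([x[i * tau : i * tau + n] for i in range(m)])

    x = np.sin(np.linspace(0, 20 * np.pi, 2000))   # toy univariate series
    X = delay_embed(x, m=3, tau=5)                 # each row is one reconstructed state vector
    print(X.shape)                                 # (1990, 3)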
PixelCNN is a deep learning architecture – or bundle of architectures – designed to generate highly realistic-looking images. To use it, no reverse-engineering of arXiv papers or search for reference...continue reading.
Compared to other types of applications, deep learning models might not seem like likely victims of privacy attacks. However, methods exist to determine whether an entity was used in the training...continue reading.
Deep learning need not be irreconcilable with privacy protection. Federated learning enables on-device, distributed model training; encryption keeps model and gradient updates private; differential privacy prevents the training data from...continue reading.
The term “federated learning” was coined to describe a form of distributed model training where the data remains on client devices, i.e., is never shipped to the coordinating server. In...continue reading.
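The best-known aggregation scheme in this setting is federated averaging: clients train locally and send back only parameters, which the server averages. A bare-bones sketch of that idea, with NumPy vectors standing in for model weights (all names and numbers below are made up for illustration):

    import numpy as np

    def client_update(global_weights, local_data, lr=0.1):
        # stand-in for local training: one gradient step on a toy quadratic loss
        grad = global_weights - local_data.mean(axis=0)
        return global_weights - lr * grad

    rng = np.random.default_rng(0)
    clients = [rng.normal(loc=i, size=(20, 3)) for i in range(4)]   # data never leaves the "device"
    weights = np.zeros(3)

    for _ in range(10):
        local = [client_update(weights, data) for data in clients]  # runs on each client
        weights = np.mean(local, axis=0)                            # server sees only parameters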
Kullback-Leibler divergence is not just used to train variational autoencoders or Bayesian networks (and not just a hard-to-pronounce thing). It is a fundamental concept in information theory, put to use...continue reading.
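For two discrete distributions p and q, the divergence is D_KL(p || q) = sum_i p_i log(p_i / q_i); a quick NumPy check on made-up distributions:

    import numpy as np

    p = np.array([0.1, 0.4, 0.5])   # "true" distribution
    q = np.array([0.3, 0.3, 0.4])   # approximating distribution

    kl_pq = np.sum(p * np.log(p / q))   # D_KL(p || q)
    kl_qp = np.sum(q * np.log(q / p))   # D_KL(q || p): differs, KL is not symmetric
    print(kl_pq, kl_qp)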
Broadcasting, as done by Python’s scientific computing library NumPy, involves dynamically extending shapes so that arrays of different sizes may be passed to operations that expect conformity – such as...continue reading.
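A toy example (not taken from the post): adding a length-3 vector to a 2 x 3 matrix works because NumPy virtually stretches the vector along the missing axis.

    import numpy as np

    a = np.arange(6).reshape(2, 3)   # shape (2, 3)
    b = np.array([10, 20, 30])       # shape (3,), broadcast to (2, 3)
    print(a + b)
    # [[10 21 32]
    #  [13 24 35]]

    c = np.array([[100], [200]])     # shape (2, 1), broadcast along columns
    print(a + c)
    # [[100 101 102]
    #  [203 204 205]]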
TensorFlow 2.1, released last week, allows for mixed-precision training, making use of the Tensor Cores available in the most recent NVIDIA GPUs. In this post, we report first experimental results...continue reading.
Differential Privacy guarantees that the results of a database query are essentially unaffected by the presence or absence of any single individual in the data. Applied to machine learning, we expect that no...continue reading.
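In formal terms (the standard epsilon-differential-privacy definition, not anything specific to this post): a randomized mechanism M is epsilon-differentially private if, for any two databases D and D' differing in a single individual, and any set of outcomes S,

    \Pr[\, M(D) \in S \,] \;\le\; e^{\varepsilon} \, \Pr[\, M(D') \in S \,]

The smaller epsilon, the less any one record can influence what an observer learns from the output.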
Continuing our tour of applications of TensorFlow Probability (TFP), after Bayesian Neural Networks, Hamiltonian Monte Carlo and State Space Models, here we show an example of Gaussian Process Regression. In...continue reading.
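Independently of the TFP API, the GP posterior predictive mean can be written down in a few lines of linear algebra; a sketch with a squared-exponential kernel on made-up data (kernel parameters and noise variance are arbitrary illustrative values):

    import numpy as np

    def rbf(x1, x2, lengthscale=1.0, amplitude=1.0):
        # squared-exponential (RBF) covariance between two sets of 1-d inputs
        d = x1[:, None] - x2[None, :]
        return amplitude ** 2 * np.exp(-0.5 * (d / lengthscale) ** 2)

    x_train = np.array([-2.0, -1.0, 0.5, 1.5])
    y_train = np.sin(x_train)
    x_test = np.linspace(-3, 3, 50)
    noise = 0.1   # observation noise variance

    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf(x_test, x_train)
    mean = K_s @ np.linalg.solve(K, y_train)   # posterior mean: K_s (K + sigma^2 I)^{-1} y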
Looking for materials to get started with deep learning from R? This post presents useful tutorials, guides, and background documentation on the new TensorFlow for R website. Advanced users will...continue reading.
In a Bayesian neural network, layer weights are distributions, not tensors. Using tfprobability, the R wrapper to TensorFlow Probability, we can build regular Keras models that have probabilistic layers, and...continue reading.
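To make "weights are distributions" concrete, here is a sketch in Python TFP (which the R wrapper mirrors closely), assuming tfp.layers.DenseFlipout as the variational layer; because the kernel is sampled anew on every forward pass, two calls on the same input give different outputs.

    import numpy as np
    import tensorflow as tf
    import tensorflow_probability as tfp

    model = tf.keras.Sequential([
        tfp.layers.DenseFlipout(16, activation="relu"),
        tfp.layers.DenseFlipout(1),
    ])

    x = np.random.normal(size=(5, 4)).astype("float32")
    print(model(x).numpy().ravel())   # one draw from the (untrained) posterior predictive
    print(model(x).numpy().ravel())   # a different draw: the weights are sampled, not fixed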
Part of the r-tensorflow ecosystem, tfprobability is an R wrapper to TensorFlow Probability, the Python probabilistic programming framework developed by Google. We take the occasion of tfprobability’s acceptance on CRAN...continue reading.
Is society ready to deal with challenges brought about by artificially-generated information – fake images, fake videos, fake text? While this post won’t answer that question, it should help form...continue reading.
TensorFlow 2.0 was finally released last week. As R users we have two kinds of questions. First, will my keras code still run? And second, what is it that changes?...continue reading.
TensorFlow Probability, and its R wrapper tfprobability, provide Markov Chain Monte Carlo (MCMC) methods that were used in a number of recent posts on this blog. These posts were directed...continue reading.
Have you ever wondered why you can call TensorFlow – mostly known as a Python framework – from R? If not – that’s how it should be, as the R...continue reading.
In this post we use tfprobability, the R interface to TensorFlow Probability, to model censored data. Again, the exposition is inspired by the treatment of this topic in Richard McElreath’s...continue reading.
Previous posts featuring tfprobability – the R interface to TensorFlow Probability – have focused on enhancements to deep neural networks (e.g., introducing Bayesian uncertainty estimates) and fitting hierarchical models with...continue reading.