Machine Learning Weekly Digest.
Welcome to this week's edition of the Best of Machine Learning Digest. In this weekly newsletter, we resurface some of the best Machine Learning resources posted in the past week. This time, we received 68 submissions, including 3 papers.
This newsletter is sponsored by no one ;). Let's change that.

Papers

This week, 3 papers were posted on Best of ML. Below, we highlight the top 3 posts of the week.
ML-SIM: A deep neural network for reconstruction of structured illumination microscopy images
 
Structured illumination microscopy (SIM) has become an important technique for optical super-resolution imaging because it allows a doubling of image resolution at speeds compatible with live-cell imaging. However, the reconstruction of SIM images is often slow and prone to artefacts. Here we propose a versatile reconstruction method, ML-SIM, which makes use of machine learning. The model is an end-to-end deep residual neural network that is trained on a simulated data set to be free of common SIM artefacts. ML-SIM is thus robust to noise and irregularities in the illumination patterns of the raw SIM input frames. The reconstruction method is widely applicable and does not require the acquisition of experimental training data. Since the training data are generated from simulations of the SIM process on images from generic libraries, the method can be efficiently adapted to specific experimental SIM implementations.
 
A Survey of Deep Learning for Scientific Discovery
 
Over the past few years, we have seen fundamental breakthroughs in core problems in machine learning, largely driven by advances in deep neural networks. At the same time, the amount of data collected in a wide array of scientific domains is dramatically increasing in both size and complexity. Taken together, this suggests many exciting opportunities for deep learning applications in scientific settings. But a significant challenge to this is simply knowing where to start. The sheer breadth and diversity of different deep learning techniques makes it difficult to determine what scientific problems might be most amenable to these methods, or which specific combination of methods might offer the most promising first approach.
 
Learning to Simulate Complex Physics with Graph Networks
 
Here we present a general framework for learning simulation, and provide a single model implementation that yields state-of-the-art performance across a variety of challenging physical domains, involving fluids, rigid solids, and deformable materials interacting with one another. Our framework---which we term "Graph Network-based Simulators" (GNS)---represents the state of a physical system with particles, expressed as nodes in a graph, and computes dynamics via learned message-passing. Our results show that our model can generalize from single-timestep predictions with thousands of particles during training, to different initial conditions, thousands of timesteps, and at least an order of magnitude more particles at test time. Our model was robust to hyperparameter choices across various evaluation metrics: the main determinants of long-term performance were the number of message-passing steps, and mitigating the accumulation of error by corrupting the training data with noise. Our GNS framework is the most accurate general-purpose learned physics simulator to date, and holds promise for solving a wide range of complex forward and inverse problems.
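To make the core idea concrete, here is a minimal pure-Python sketch of one message-passing step over a particle graph in the spirit of GNS. This is not the authors' implementation: the "learned" message and update functions are stand-in hand-written maps (displacement messages, a small linear step), chosen only to illustrate how node states, edges, and permutation-invariant aggregation fit together.

```python
# One message-passing step over a particle graph (illustrative sketch).
# Node states are 2-D vectors; in a real GNS these functions are
# learned neural networks, not the fixed maps used here.

def message(sender, receiver):
    # Hypothetical message: displacement from receiver toward sender.
    return [s - r for s, r in zip(sender, receiver)]

def aggregate(messages):
    # Sum incoming messages component-wise (permutation-invariant).
    if not messages:
        return [0.0, 0.0]
    return [sum(component) for component in zip(*messages)]

def update(state, agg, step=0.1):
    # Move the node a small step along the aggregated message.
    return [x + step * m for x, m in zip(state, agg)]

def message_passing_step(states, edges):
    """states: {node_id: [x, y]}, edges: list of (sender, receiver)."""
    inbox = {node: [] for node in states}
    for sender, receiver in edges:
        inbox[receiver].append(message(states[sender], states[receiver]))
    return {node: update(states[node], aggregate(inbox[node]))
            for node in states}

states = {0: [0.0, 0.0], 1: [1.0, 0.0]}
edges = [(0, 1), (1, 0)]          # two particles pulling on each other
new_states = message_passing_step(states, edges)
```

In the paper, several such steps are stacked per timestep, and the number of message-passing steps is reported as one of the main determinants of long-term accuracy.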
 

Projects

This week, 23 projects were posted on Best of ML. Below, we highlight the top 3 posts of the week.
A doc to collect and share informative blog posts on deep learning and reinforcement learning
 
Stepping into the field of machine learning, I've realized that an informative blog post can sometimes serve as an extremely helpful friend to guide you through learning and researching. So I've decided to gather all the informative, well-explained blog posts I've ever read, and I'm hoping to get some more from you guys :)
 
Optimized Vanilla Transformer as a Strong Baseline in Abstractive Summarization (TensorFlow)
 
The official TensorFlow model for abstractive summarization is optimized by changing the learning-rate schedule, applying trigram blocking, and similar tweaks.
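Trigram blocking is a simple decoding-time heuristic against repetitive summaries: a candidate next token is rejected if appending it would reproduce a trigram already present in the output so far. Here is a minimal illustrative sketch of the check (not the repo's code), with a hypothetical `would_repeat_trigram` helper:

```python
# Trigram blocking during generation (illustrative sketch): block a
# candidate token if it would complete a trigram we've already emitted.

def would_repeat_trigram(tokens, candidate):
    """Return True if tokens + [candidate] ends in an already-seen trigram."""
    if len(tokens) < 2:
        return False
    new_trigram = (tokens[-2], tokens[-1], candidate)
    seen = {tuple(tokens[i:i + 3]) for i in range(len(tokens) - 2)}
    return new_trigram in seen

tokens = ["the", "cat", "sat", "on", "the", "cat"]
blocked = would_repeat_trigram(tokens, "sat")  # "the cat sat" already occurred
allowed = would_repeat_trigram(tokens, "ran")  # new trigram, fine to emit
```

In beam search, this check is typically applied to each beam before scoring candidate continuations.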
 
torchlayers: Shape inference for PyTorch (like in Keras) + new SoTA layers!
 
torchlayers is a library built on PyTorch that provides automatic shape and dimensionality inference for torch.nn layers, plus additional building blocks featured in current SOTA architectures (e.g. EfficientNet).
 

Blog Posts

This week, 38 blog posts were posted on Best of ML. Below, we highlight the top 3 posts of the week.
A Start-to-Finish Guide to Building Deep Neural Networks in Keras
 
Learning deep learning is daunting, so libraries like Keras that make it easy are helpful. In this article, I outline, explain, and provide code for 7 steps to building an image-recognition deep convolutional neural network in Keras.
 
Painting Pixel Art With Machine Learning
 
The above sprites come from the Trajes Fatais: Suits of Fate game, which I work on as the lead developer. Long story short, each sprite takes about one hour to be drawn, and each character takes up to five hundred sprites, on average. In “Towards Machine-Learning Assisted Asset Generation for Games: A Study on Pixel Art Sprite Sheets,” we explore the Pix2Pix architecture to automate the sprite production pipeline, reducing the average time taken per sprite by 15 minutes (~25%). This is our first published work on sprite generation, and we expect to improve it further in the future.
 
Tools and tips for remote machine learning work
 
A summary of tools for distributed or remote machine learning work. Helpful for those working as a distributed team or interacting with a client remotely.
 






Monn Ventures · Winterthurerstrasse 649 · Zürich 8051 · Switzerland
