Google’s DeepMind works to improve Artificial Intelligence
Machine learning and artificial intelligence workloads keep growing, running up against the limits of computing hardware, and that is pushing scientists to rethink how they design neural networks.
Last week's research offering from Google, called Reformer, made that clear: the aim was to fit a natural language program onto a single graphics processing chip instead of eight.
This week, Google has offered another effort focused on efficiency, something called Sideways.
Sideways starts from a basic fact about machine learning: during the training phase, neural nets use a forward pass, the transmission of a signal through the layers of the network.
Google's DeepMind unit noticed that, at every moment during the forward pass and the backward pass, a deep learning neural net is doing less than it could be doing.
When a neural net processes data, it has to wait until both the forward pass and the backward pass have been computed. That is because the activations of the network's layers, which are triggered during the forward pass, need to be held onto by the computer so it can use them when it computes backpropagation.
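To see why those activations must be kept around, here is a minimal NumPy sketch of a two-layer network, not DeepMind's Sideways code but the standard training pattern it improves on: the forward pass stores each layer's activation, and the backward pass reuses those stored values to compute gradients.

```python
import numpy as np

# Illustrative sketch of standard forward/backward training,
# not DeepMind's Sideways implementation.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))       # batch of 4 inputs, 3 features each
y = rng.normal(size=(4, 2))       # targets
W1 = rng.normal(size=(3, 5))      # layer 1 weights
W2 = rng.normal(size=(5, 2))      # layer 2 weights

# --- Forward pass: the activation a1 must be kept in memory ---
a1 = np.maximum(0, x @ W1)        # ReLU activation of layer 1 (stored)
out = a1 @ W2                     # network output
loss = np.mean((out - y) ** 2)    # mean squared error

# --- Backward pass: the stored activation a1 is needed here ---
d_out = 2 * (out - y) / y.size    # gradient of the loss w.r.t. the output
dW2 = a1.T @ d_out                # gradient for W2 uses the stored a1
d_a1 = (d_out @ W2.T) * (a1 > 0)  # ReLU gradient also needs a1
dW1 = x.T @ d_a1                  # gradient for W1

print(dW1.shape, dW2.shape)       # (3, 5) (5, 2)
```

Because `dW2` and `d_a1` both read `a1`, the computer cannot discard the forward-pass activations until backpropagation has finished; that dependency is the idle time Sideways targets.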
The future looks very interesting, as placing different Sideways modules on different GPUs should also significantly reduce the memory requirements for training large neural networks.