Keras - How to construct a shared Embedding() Layer for each Input-Neuron

Another Coder

I want to create a deep neural network in Keras where each element of the input layer is "encoded" using the same shared Embedding() layer before it is fed into the deeper layers.

Each input would be a number that defines the type of an object, and the network should learn an embedding that encapsulates some internal representation of "what this object is".

So, if the input layer has X dimensions, and the embedding has Y dimensions, the first hidden layer should consist of X*Y neurons (each input neuron embedded).

Here is a little image that shows the network architecture I would like to create, where each input element is encoded using a 3D embedding.

How can I do this?

Nassim Ben
from keras.layers import Input, Embedding

first_input = Input(shape=your_shape_tuple)
second_input = Input(shape=your_shape_tuple)
...

# Embedding needs both the vocabulary size and the embedding dimension
embedding_layer = Embedding(input_dim=vocab_size, output_dim=embedding_size)

# Calling the same layer instance on every input shares its weights
first_input_encoded = embedding_layer(first_input)
second_input_encoded = embedding_layer(second_input)
...

...and then build the rest of the model on top of the encoded inputs.

The embedding_layer will have shared weights: because you call the same layer instance on every input, all inputs are encoded with the same weight matrix. You can do this in the form of lists of layers if you have a lot of inputs.
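As a minimal sketch of that list-based pattern (vocab_size, embedding_size, and num_inputs below are illustrative placeholders, not values from the question):

from keras.layers import Input, Embedding, Flatten, concatenate
from keras.models import Model

vocab_size = 1000      # assumed number of distinct object types
embedding_size = 3     # assumed embedding dimension
num_inputs = 5         # assumed number of separate scalar inputs

# One Input per object index; each carries a single integer
inputs = [Input(shape=(1,), dtype='int32') for _ in range(num_inputs)]

# A single Embedding instance, so every input shares the same weights
shared_embedding = Embedding(input_dim=vocab_size, output_dim=embedding_size)

# Encode each input with the shared layer and flatten its (1, embedding_size) output
encoded = [Flatten()(shared_embedding(inp)) for inp in inputs]

# Concatenate into one vector of num_inputs * embedding_size values
merged = concatenate(encoded)

model = Model(inputs=inputs, outputs=merged)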

If what you want is to transform a whole tensor of input indices at once, the way to do it is:

from keras.layers import Input, Embedding

# If your inputs are all fed in as one numpy array of integer indices:
input_layer = Input(shape=(num_input_indices,), dtype='int32')

# The output of this layer is a tensor of shape
# (batch_size, num_input_indices, embedding_size)
embedded_input = Embedding(input_dim=vocab_size, output_dim=embedding_size)(input_layer)
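And here is a minimal sketch of how this maps onto the X*Y first hidden layer from the question (again with placeholder sizes): flattening the embedded tensor gives the first Dense layer num_input_indices * embedding_size input values.

from keras.layers import Input, Embedding, Flatten, Dense
from keras.models import Model
import numpy as np

vocab_size = 1000        # assumed number of object types
num_input_indices = 5    # X: number of input elements
embedding_size = 3       # Y: embedding dimension

input_layer = Input(shape=(num_input_indices,), dtype='int32')
embedded = Embedding(input_dim=vocab_size, output_dim=embedding_size)(input_layer)

# Flatten (num_input_indices, embedding_size) into X*Y = 15 values,
# matching the "X*Y neurons" the question describes
flat = Flatten()(embedded)
hidden = Dense(32, activation='relu')(flat)
output = Dense(1, activation='sigmoid')(hidden)

model = Model(inputs=input_layer, outputs=output)
model.compile(optimizer='adam', loss='binary_crossentropy')

# A batch of 2 samples, each a row of 5 object indices
dummy = np.array([[1, 4, 2, 7, 0], [3, 3, 9, 1, 5]])
print(model.predict(dummy).shape)  # (2, 1)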

Is this what you were looking for?

