Error: keras merge LSTM layers in sum mode
I want to merge two sequential models in sum mode into one model using Keras:

    from keras.models import Sequential
    from keras.layers import LSTM, Add

    left = Sequential()
    left.add(LSTM(64, activation='sigmoid', stateful=True, batch_input_shape=(10, look_back, dim)))

    right = Sequential()
    right.add(LSTM(64, activation='sigmoid', stateful=True, batch_input_shape=(10, look_back, dim)))

    model = Sequential()
    model.add(Add()([left, right]))

But the statement model.add(Add()([left, right])) gives the error: "Layer add was called with an input that isn't a symbolic tensor. Received type: . Full input: [, ]. All inputs to the layer should be tensors."
keras lstm
asked Mar 29 at 9:32 by shaifali Gupta
1 Answer
The error tells you the problem: the Add layer expects tensors, but you are giving it Sequential model objects.

Use the functional API (from keras.models import Model), not Sequential. Then merge the models with:

    merged_models = Model(inputs=[first_model_input, second_model_input],
                          outputs=[first_model_output, second_model_output])

or whatever your inputs look like.

answered Mar 29 at 9:45 by Antonio Jurić
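The merged_models line above keeps the two branch outputs separate rather than summing them. As a minimal sketch (not part of the original answer, and assuming the left and right models from the question are already built), the Add layer can be combined with a functional Model to get the actual sum:

    from keras.models import Model
    from keras.layers import Add

    # Element-wise sum of the two LSTM branch outputs ("sum mode"),
    # wrapped in a single functional Model.
    summed = Add()([left.output, right.output])
    merged_models = Model(inputs=[left.input, right.input], outputs=summed)

merged_models then maps the two (10, look_back, dim) input batches to a single (10, 64) tensor that further layers can consume.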
In my case I need to pass the output of Add() to further layers after merging, for example a Dense layer. If I use merged_models = Model(inputs=[left.input, right.input], outputs=[left.output, right.output]) and then dd = Dense(dm, activation='linear')(merged_models.output), it again gives an error that Dense was expecting one input but got two. Does this mean that the output of Model is not merging them?
– shaifali Gupta, Mar 29 at 10:01

Then your approach (and the question) is wrong. Don't create the models before you have finished defining all the layers; that way you won't get errors about a tensor being expected but something else given. Create the model at the end.
– Antonio Jurić, Mar 29 at 10:03

"Does this mean that the output of Model is not merging them?" Yes, the model is not merging the outputs; that is not the purpose of the functional Model constructor. To merge the layers, use layers, not the model constructor.
– Antonio Jurić, Mar 29 at 10:06
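Putting the comments together, a minimal sketch of the layer-first approach: define every layer, merge with Add, attach the Dense layer, and only build the Model at the end. It assumes look_back, dim and dm are defined as in the question and comments, with the batch size of 10 from batch_input_shape:

    from keras.models import Model
    from keras.layers import Input, LSTM, Add, Dense

    # Stateful LSTMs need a fixed batch size, so the inputs carry the full batch_shape.
    left_in = Input(batch_shape=(10, look_back, dim))
    right_in = Input(batch_shape=(10, look_back, dim))

    left_out = LSTM(64, activation='sigmoid', stateful=True)(left_in)
    right_out = LSTM(64, activation='sigmoid', stateful=True)(right_in)

    summed = Add()([left_out, right_out])        # "sum mode" merge
    dd = Dense(dm, activation='linear')(summed)  # dm output units, as in the comment

    # Create the model only at the end, as the comments advise.
    model = Model(inputs=[left_in, right_in], outputs=dd)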