Error: keras merge LSTM layers in sum mode



I want to merge two sequential models in sum mode into one model using Keras, as follows:

from keras.models import Sequential
from keras.layers import LSTM, Add

left = Sequential()
left.add(LSTM(64, activation='sigmoid', stateful=True, batch_input_shape=(10, look_back, dim)))
right = Sequential()
right.add(LSTM(64, activation='sigmoid', stateful=True, batch_input_shape=(10, look_back, dim)))
model = Sequential()
model.add(Add()([left, right]))  # this line raises the error below

But the statement model.add(Add()([left, right])) gives the error: Layer add was called with an input that isn't a symbolic tensor. Received type: <class 'keras.engine.sequential.Sequential'>. All inputs to the layer should be tensors.






















Tags: keras lstm






asked Mar 29 at 9:32 by shaifali Gupta




















          1 Answer
The error says what the problem is: the method expects tensors, but you are giving it Sequential model objects.

Use the functional model (from keras.models import Model), not Sequential.

Then merge the models with:

merged_models = Model(inputs=[first_model_input, second_model_input], outputs=[first_model_output, second_model_output])

or whatever your inputs look like.

answered Mar 29 at 9:45 by Antonio Jurić
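As a rough sketch of what this answer describes, reusing the left and right Sequential branches from the question (the merged_models name comes from the answer; note, per the comments below, that this only collects the two outputs side by side, it does not sum them):

from keras.models import Model

# Wrap the two existing Sequential branches into one functional Model.
# This exposes both outputs together; it does not add them element-wise.
merged_models = Model(inputs=[left.input, right.input],
                      outputs=[left.output, right.output])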






• In my case, I need to pass the output of Add() to the next layers after merging, for example Dense. If I use merged_models = Model(inputs=[left.input, right.input], outputs=[left.output, right.output]) and then dd = Dense(dm, activation='linear')(merged_models.output), it again gives an error that Dense was expecting one input but got two. Does this mean that the output of Model is not merging them? – shaifali Gupta, Mar 29 at 10:01

• Then your approach (and the question) is wrong. Don't create the models before you have finished defining all the layers; that way you won't get errors about tensors being expected but something else being given. Create the model at the end. – Antonio Jurić, Mar 29 at 10:03

• "Does this mean that the output of Model is not merging them?" Yes, the model is not merging the outputs; that is not the purpose of the functional Model constructor. To merge the layers, use layers, not the model constructor. – Antonio Jurić, Mar 29 at 10:06
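Putting the comments together (merge with layers, and build the Model only after all layers are defined), a minimal sketch of the summed-LSTM setup; look_back, dim and dm are the question's undefined variables, and the concrete values plus the names left_in, right_in, summed and dd are illustrative assumptions:

from keras.models import Model
from keras.layers import Input, LSTM, Add, Dense

# Illustrative values for the question's undefined dimensions.
look_back, dim, dm = 5, 3, 1

# Build both branches on Input tensors instead of Sequential models.
left_in = Input(batch_shape=(10, look_back, dim))
right_in = Input(batch_shape=(10, look_back, dim))
left_out = LSTM(64, activation='sigmoid', stateful=True)(left_in)
right_out = LSTM(64, activation='sigmoid', stateful=True)(right_in)

# Add() now receives tensors, so the "isn't a symbolic tensor" error goes away.
summed = Add()([left_out, right_out])
dd = Dense(dm, activation='linear')(summed)

# Create the model only after all layers are wired together, as the comments suggest.
model = Model(inputs=[left_in, right_in], outputs=dd)
model.compile(optimizer='adam', loss='mse')

Because both branches are stateful and use batch_shape=(10, ...), the model must then be trained and evaluated with a fixed batch size of 10.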










