Why does an MLP work similarly to an RNN for text generation?
I was trying to perform text generation using only a character-level feed-forward neural network, after following this tutorial, which uses an LSTM. I one-hot encoded the characters of my corpus, which gave vectors of length 45. Then I concatenated every 20 characters and fed the resulting 20 × 45 = 900-length vector as input to an MLP, with the 21st character's one-hot vector as the output.



Thus my X (input data) has shape (144304, 900)
and my Y (output data) has shape (144304, 45).
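
For concreteness, here is a minimal sketch of the data preparation and the MLP described above (Keras assumed; the file name, layer width, and training settings are illustrative placeholders, not the exact code):

import numpy as np
from tensorflow import keras

corpus = open("alice.txt").read().lower()           # placeholder file name
chars = sorted(set(corpus))                         # ~45 distinct characters
char_to_idx = {c: i for i, c in enumerate(chars)}
vocab, window = len(chars), 20

# One-hot encode each character, then flatten every 20-character window
# into a single 20*45 = 900-dimensional input vector; the target is the
# one-hot vector of the 21st character.
X = np.zeros((len(corpus) - window, window * vocab), dtype=np.float32)
Y = np.zeros((len(corpus) - window, vocab), dtype=np.float32)
for i in range(len(corpus) - window):
    for t, c in enumerate(corpus[i:i + window]):
        X[i, t * vocab + char_to_idx[c]] = 1.0
    Y[i, char_to_idx[corpus[i + window]]] = 1.0

model = keras.Sequential([
    keras.layers.Dense(256, activation="relu", input_shape=(window * vocab,)),
    keras.layers.Dense(vocab, activation="softmax"),
])
model.compile(loss="categorical_crossentropy", optimizer="adam")
model.fit(X, Y, batch_size=128, epochs=20)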



Here's the output from my code:




alice was beginning very about a grible thing was and bet she with a
great come and fill feel at and beck to the darcht repeat allice
waited it, put this was not an encir to the white knew the mock turtle
with a sigh. ‘i only took the regular cours was be crosd it to fits
some to see it was and getting she dodn as the endge of the evence,
and went on to love that you were no alway not--ohe f whow to the
gryphon is, who denight to goover and even stried to the dormouse, and
repeated her question. ‘why did they live at the bottom of a well?’



‘tabl the without once it it howling it the duchess to herself it as
eng, longing one of the door and wasting for the homend of the taits.’



‘sthing i cancus croquet with the queen to-day?’



‘i should like it very much,’ said the dryphon, ‘you first form into a
line along-the sea-shore--’



‘the right!’ cried the queen nother frowing tone. any the this her for
some thing is and at like the look of it at all,’ said the king:
‘however, it may kiss my hand if it likes.’



‘i’d really feeer that in a few this for some whele wish to get thing
to eager to think thcapered twice, and shook note bill herself in a
lell as expectant, and thowedd all have come fuconfuse it the march
hare: she thought it must be the right way of speaking to a mouse: she
had never done such a thing before, but she remembered having seen in
her brother’s latin grammar, ‘a mouse--of a mouse--to a mouse--a
mouse--o mister once to hin fent on the words with her fane with ale
three king the said, and diffich a vage and so mane alice this cime.
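
That text was produced by repeatedly sliding a 20-character window over what has been generated so far and sampling the next character from the softmax output; roughly a loop like the following, reusing the names from the sketch above (assumed, not the exact code):

def generate(model, seed, n_chars=400):
    text = seed
    for _ in range(n_chars):
        # One-hot encode the last 20 characters, exactly as in training.
        x = np.zeros((1, window * vocab), dtype=np.float32)
        for t, c in enumerate(text[-window:]):
            x[0, t * vocab + char_to_idx[c]] = 1.0
        probs = model.predict(x, verbose=0)[0].astype("float64")
        probs /= probs.sum()                # guard against float32 drift
        # Sample from the softmax distribution instead of taking argmax,
        # which keeps the output from looping on frequent n-grams.
        text += chars[np.random.choice(vocab, p=probs)]
    return text

print(generate(model, corpus[:window]))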




My question is: why does the MLP work similarly to an RNN/LSTM? What is the advantage of using an RNN/LSTM over an MLP for such tasks?
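
For reference, the LSTM version from the tutorial I followed keeps each window as a sequence of shape (20, 45) rather than a flat 900-vector, so the same weights are applied at every time step; a sketch of that variant on the same data (again with placeholder sizes):

# Reshape the same data so each sample is a sequence of 20 one-hot
# vectors, shape (samples, 20, 45), instead of one flat 900-vector.
X_seq = X.reshape(-1, window, vocab)

lstm_model = keras.Sequential([
    keras.layers.LSTM(128, input_shape=(window, vocab)),
    keras.layers.Dense(vocab, activation="softmax"),
])
lstm_model.compile(loss="categorical_crossentropy", optimizer="adam")
lstm_model.fit(X_seq, Y, batch_size=128, epochs=20)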
Tags: neural-network, recurrent-neural-net, natural-language-process, language-model
asked Mar 28 at 17:46 by Atif Hassan