How to calculate which word fits the best given a context and possible words?
I have this task for research purposes and have searched for a while for a framework or a paper that already addresses this problem.

Unfortunately, I haven't found anything that helps with my problem.

I have a sentence like

if the age of the applicant is **higher** than 18, then ...

and a list of candidate words like

higher, bigger, greater, wider ...

which are all comparative adjectives.

Now I want to find out which of the given words fits best at the predefined position in the sentence.

The best-fitting word in this example would be 'greater', but 'higher' would also be fine. In my specific case, I want to show an error message if someone writes 'wider', because it doesn't make sense in this semantic context.

I hope that I explained my problem well enough.
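The desired behavior can be sketched with a toy co-occurrence scorer. Everything here is an illustrative assumption, not part of the question: the tiny corpus, the choice of context words, and the rule that a candidate with zero co-occurrence gets flagged.

```python
from collections import Counter
from itertools import combinations

# Toy corpus standing in for real training data (illustrative only).
corpus = [
    "the age of the person is greater than the limit",
    "her age is higher than his age",
    "the age limit is greater than expected",
    "the river is wider than the road",
    "the box is bigger than the bag",
]

# Count within-sentence word co-occurrences.
cooc = Counter()
for sent in corpus:
    for a, b in combinations(set(sent.split()), 2):
        cooc[frozenset((a, b))] += 1

def score(candidate, context_words):
    """Sum co-occurrence counts between the candidate and each context word."""
    return sum(cooc[frozenset((candidate, w))] for w in context_words)

context = ["age", "applicant"]  # content words around the slot (assumed)
candidates = ["higher", "bigger", "greater", "wider"]
ranked = sorted(candidates, key=lambda w: score(w, context), reverse=True)
# Flag candidates that never co-occur with the context words.
errors = [w for w in candidates if score(w, context) == 0]
# 'greater' ranks first; 'bigger' and 'wider' get flagged in this toy corpus.
print(ranked, errors)
```

A real system would replace the co-occurrence counts with a trained language model, but the interface stays the same: score each candidate in context, rank, and reject below a threshold.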
  • I think it would be helpful to answer your question if you could define "best fit" a bit more. – oW_ Mar 26 at 15:22
machine-learning nlp natural-language-process
edited Mar 27 at 7:52
user8238644

asked Mar 26 at 12:41
user8238644  32
1 Answer
There are two options:

  1. CBOW: modify Word2Vec CBOW code to save the whole trained model (current implementations often persist only the embedding layer). The CBOW model takes the context of each word as input and tries to predict the word corresponding to that context.

Intro: https://towardsdatascience.com/introduction-to-word-embedding-and-word2vec-652d0c2060fa
Example: https://www.tensorflow.org/tutorials/representation/word2vec
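The CBOW scoring idea can be sketched with toy vectors: average the context embeddings, then compare each candidate's embedding against that average. The 3-d vectors below are made up for illustration; a real setup would use vectors trained by Word2Vec.

```python
import math

# Toy 3-d embeddings (illustrative values, not trained vectors).
emb = {
    "age":     [0.9, 0.1, 0.0],
    "than":    [0.5, 0.5, 0.0],
    "greater": [0.7, 0.3, 0.0],
    "higher":  [0.7, 0.3, 0.2],
    "bigger":  [0.5, 0.4, 0.4],
    "wider":   [0.1, 0.2, 0.9],
}

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def cbow_score(context, candidate):
    """Average the context vectors, then compare with the candidate vector."""
    n = len(context)
    avg = [sum(emb[w][i] for w in context) / n for i in range(3)]
    return cosine(avg, emb[candidate])

context = ["age", "than"]  # context words around the slot (assumed)
ranked = sorted(["higher", "bigger", "greater", "wider"],
                key=lambda w: cbow_score(context, w), reverse=True)
print(ranked)  # 'wider' lands last: its vector points away from the context
```

This is only the scoring half; as the answer notes, you need the full CBOW output layer (not just the embeddings) to get proper probabilities over the vocabulary.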



  2. Train an LSTM / GRU to predict the next word (given the previous N words).

Karpathy's article is probably the best introduction to text generation with RNNs (it works at the character level; you will have to modify it to work at the word level, i.e. on word vectors):

http://karpathy.github.io/2015/05/21/rnn-effectiveness/

Example:

https://medium.com/phrasee/neural-text-generation-generating-text-using-conditional-language-models-a37b69c7cd4b
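Either model ultimately yields a probability for each candidate word at the slot, so the error-message behavior the question asks for reduces to thresholding those probabilities. The distribution and cutoff below are illustrative stand-ins for a trained model's softmax output.

```python
# Hypothetical next-word distribution, as an LSTM/CBOW softmax would
# produce for the context "...the applicant is ___ than 18" (made up).
probs = {"greater": 0.46, "higher": 0.30, "bigger": 0.15, "wider": 0.01}

THRESHOLD = 0.05  # illustrative cutoff for "does not fit here"

def check(candidates, probs, threshold=THRESHOLD):
    """Return (best_word, rejected_words) for the given slot."""
    best = max(candidates, key=lambda w: probs.get(w, 0.0))
    rejected = [w for w in candidates if probs.get(w, 0.0) < threshold]
    return best, rejected

best, rejected = check(["higher", "bigger", "greater", "wider"], probs)
print(best, rejected)  # greater ['wider']
```

The threshold would need tuning on real data; an absolute cutoff is the simplest choice, but a relative one (e.g. a fraction of the top candidate's probability) may be more robust.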
  • Thank you for your fast answer, I will have a look at both! – user8238644 Mar 27 at 7:29
answered Mar 26 at 12:58
Shamit Verma  1,319
Thanks for contributing an answer to Data Science Stack Exchange!