What is the difference between “offline trained model” and “pretrained model”?
I am confused about whether these two terms mean the same thing or not, and how I can distinguish both of them from an online-trained model.

machine-learning deep-learning computer-vision image-recognition
Pre-trained models are models trained on large datasets with high computation power. They tend to have high accuracy and a higher level of generalisation. Offline models could be models trained on a local machine rather than on a server or in the cloud. – Shubham Panchal, Mar 28 at 8:10
asked Mar 28 at 6:46 by Md. Maklachur Rahman
2 Answers
Every pre-trained model is an offline-trained model, but not the reverse.
Offline training is any training that leaves the model unchanged when new observations arrive, i.e. it has an end. Online training constantly updates the model with incoming observations without reusing the previous training points (although keeping a limited memory of previous samples, small compared to all seen samples, is acceptable). Therefore, periodically retraining offline on all the training points, no matter how frequently, is not the same as online training.
- We can use offline training for a model that supports online training; however, to train online, the model must allow such training. For example, common implementations of SVM cannot be trained on N new data points without reusing the previous training points. In contrast, Bayesian models are natural candidates for online learning, since they are trained by updating the belief (model) upon new observations, i.e. a posterior update.
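As a rough sketch of this contrast (my own illustration, not from the answer; scikit-learn, the toy data, and all variable names are choices made here), `SGDClassifier` supports online updates via `partial_fit`, while a typical `SVC` must be refit on all points, old and new:

```python
import numpy as np
from sklearn.linear_model import SGDClassifier
from sklearn.svm import SVC

# Hypothetical toy data: 100 "old" samples already seen, 10 "new" arrivals.
rng = np.random.default_rng(0)
X_old = rng.normal(size=(100, 3))
y_old = (X_old[:, 0] > 0).astype(int)
X_new = rng.normal(size=(10, 3))
y_new = (X_new[:, 0] > 0).astype(int)

# Online: update with only the 10 new samples; old data is not revisited.
online = SGDClassifier(random_state=0)
online.partial_fit(X_old, y_old, classes=np.array([0, 1]))
online.partial_fit(X_new, y_new)  # incremental update, no old points needed

# Offline: the SVM is refit from scratch on ALL points, old plus new.
offline = SVC()
offline.fit(np.vstack([X_old, X_new]), np.concatenate([y_old, y_new]))
```

Note that calling `partial_fit` repeatedly on fresh batches is exactly the "no end" property of online training: the model remains updatable indefinitely.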
Pre-training is an offline training followed by a main, task-specific training, hence the prefix "pre".
For example, a $512 \times 512 \times 3 \rightarrow 128$ dimensionality reduction model is pre-trained on a large dataset of RGB images (either supervised or self-supervised). Then the pre-trained model is used to reduce the dimension of our new images to $128$, and the reduced features are fed to the main, task-specific model.
Note that the word "self-supervised" (supervised by the input itself) is currently used for models that try to reconstruct or predict the whole or part of the input as closely as possible, e.g. auto-encoders, or some language models such as Word2Vec.
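A minimal sketch of this two-stage setup, with PCA standing in for the image reducer (the dimensions, dataset sizes, and downstream classifier here are made up for illustration, not taken from the answer):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Stage 1 (pre-training, offline): fit the reducer once on a large,
# generic dataset -- a stand-in for many flattened images.
X_large = rng.normal(size=(1000, 48))
reducer = PCA(n_components=16).fit(X_large)

# Stage 2 (main, task-specific training): our own, smaller labelled set
# is passed through the frozen reducer, and the reduced features feed
# the task model.
X_task = rng.normal(size=(50, 48))
y_task = rng.integers(0, 2, size=50)
clf = LogisticRegression().fit(reducer.transform(X_task), y_task)
```

The key point the example mirrors is that stage 1 ends before stage 2 begins: the reducer's training is offline, and only its fixed output is reused.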
– Esmailian, answered Mar 28 at 13:37, edited Mar 29 at 9:10
Online model: a model that continuously learns in production. If 10 new training samples arrive, we do not need to retrain with all the previous samples.

Pre-trained model: a model that has already been trained on large datasets. It is useful as a starting point for transfer learning.
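To make "continuously learns in production" concrete, here is a minimal Bayesian sketch (my own illustration, not from the answer): a Beta-Bernoulli success-rate model in which each new observation updates the posterior counts, so previously seen samples never need to be revisited.

```python
# Uniform Beta(1, 1) prior over an unknown Bernoulli success rate.
alpha, beta = 1.0, 1.0

def update(alpha, beta, x):
    """Posterior update for a single Bernoulli observation x in {0, 1}."""
    return alpha + x, beta + (1 - x)

# A stream of observations arriving one at a time, as in production.
for x in [1, 1, 0, 1]:
    alpha, beta = update(alpha, beta, x)

# Posterior mean after the stream: alpha / (alpha + beta) = 4 / 6.
posterior_mean = alpha / (alpha + beta)
```

The two running counts are the model's entire state, which is what makes the update online: storage and per-step cost stay constant no matter how many samples have been seen.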
https://www.slideshare.net/queirozfcom/online-machine-learning-introduction-and-examples
https://www.analyticsvidhya.com/blog/2015/01/introduction-online-machine-learning-simplified-2/
https://en.wikipedia.org/wiki/Online_machine_learning
– Shamit Verma, answered Mar 28 at 9:52