Why is softmax not selecting the maximum probability?
I am taking the CS231n lectures from Stanford University. There is a point about RNNs I am unable to understand: why does softmax not select the highest probability, which is 0.84 for the character 'o' (in the attached example), instead of 0.13 for the character 'e'? An explanation would be highly appreciated.
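For context, the probabilities in the slide come from applying softmax to the RNN's output logits. A minimal sketch (the logit values below are made up, chosen only so that the output roughly reproduces the slide's 0.13 for 'e' and 0.84 for 'o'):

```python
import numpy as np

def softmax(logits):
    # Subtract the max for numerical stability; result is unchanged.
    z = logits - np.max(logits)
    e = np.exp(z)
    return e / e.sum()

# Hypothetical logits over the vocabulary {'h', 'e', 'l', 'o'}.
vocab = ['h', 'e', 'l', 'o']
logits = np.array([1.0, 2.5, -1.0, 4.4])
probs = softmax(logits)
print(dict(zip(vocab, probs.round(2))))
# {'h': 0.03, 'e': 0.13, 'l': 0.0, 'o': 0.84}
```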
deep-learning probability rnn softmax
It's probably a typo or mistake in the slides.
– Matias Valdenegro
Nov 21 '18 at 15:43
@MatiasValdenegro I disagree; also, it's not only producing 'e' incorrectly but the first 'l' as well.
– Khalid Usman
Nov 21 '18 at 16:08
You disagree? And what evidence do you have of this? It's far more likely that the authors simply made a mistake, as anyone might. Anyway, what is the programming question here?
– Matias Valdenegro
Nov 21 '18 at 16:11
@MatiasValdenegro I am sorry I couldn't explain it well. I watched the video lectures and the instructor mentioned this point, but I am unable to understand his explanation. Therefore I posted here to get a better understanding of RNNs before implementing one.
– Khalid Usman
Nov 21 '18 at 23:31
edited Nov 21 '18 at 14:49
asked Nov 21 '18 at 14:46
Khalid Usman
1 Answer
I have not actually watched the lecture, but I think the 'e' at the top is the expected output (and 'l', 'l', 'o' too). The initial weights are not giving good enough results (producing 'o' instead of 'e'). As you train the network, the weights will mature, the probabilities will change, and the first prediction will ultimately result in 'e'.
Yes, you are right in general, but here the instructor mentioned that it is a sampling strategy rather than picking the maximum probability. Thanks.
– Khalid Usman
Nov 24 '18 at 13:41
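The "sampling strategy" mentioned in the comment can be sketched as follows: at generation time the next character is drawn at random from the softmax distribution rather than taken with argmax, so a low-probability character such as 'e' (0.13) can still be emitted. The probability values below are the ones quoted in the question; everything else is illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

vocab = np.array(['h', 'e', 'l', 'o'])
probs = np.array([0.03, 0.13, 0.00, 0.84])  # softmax output from the slide

# Greedy decoding: always take the argmax -- here that always emits 'o'.
greedy = vocab[int(np.argmax(probs))]
print(greedy)  # 'o'

# Stochastic decoding: sample from the distribution -- 'e' is emitted
# about 13% of the time, which is how the slide can show 'e'.
samples = rng.choice(vocab, size=10_000, p=probs)
print((samples == 'e').mean())  # roughly 0.13
```

Greedy (argmax) decoding is deterministic and tends to produce repetitive text, which is why character-level RNN demos typically sample instead.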
answered Nov 23 '18 at 12:41
Biswadip Mandal