Mean of values by day












I have the following dataset:



                   dVal              eVal
0 2015-01-01 00:00:00.000 3.622833
1 2015-01-01 01:00:00.000 3.501333
2 2015-01-01 02:00:00.000 3.469167
3 2015-01-01 03:00:00.000 3.436333
4 2015-01-01 04:00:00.000 3.428000
5 2015-01-01 05:00:00.000 3.400667
6 2015-01-01 06:00:00.000 3.405667
7 2015-01-01 07:00:00.000 3.401500
8 2015-01-01 08:00:00.000 3.404333
9 2015-01-01 09:00:00.000 3.424833
10 2015-01-01 10:00:00.000 3.489500
11 2015-01-01 11:00:00.000 3.521000
12 2015-01-01 12:00:00.000 3.527833
13 2015-01-01 13:00:00.000 3.523500
14 2015-01-01 14:00:00.000 3.511667
15 2015-01-01 15:00:00.000 3.602500
16 2015-01-01 16:00:00.000 3.657667
17 2015-01-01 17:00:00.000 3.616667
18 2015-01-01 18:00:00.000 3.534500
19 2015-01-01 19:00:00.000 3.529167
20 2015-01-01 20:00:00.000 3.548167
21 2015-01-01 21:00:00.000 3.565500
22 2015-01-01 22:00:00.000 3.539833
23 2015-01-01 23:00:00.000 3.485667
24 2015-01-02 00:00:00.000 3.493167
.........
.........


I want to compute the mean of the eVal column for each day.
The first step is to convert the dVal column to datetime:



time['dVal'] = pd.to_datetime(time['dVal'])


Next I set the datetime column as the index:



time.index = time['dVal']


Finally, I compute the mean for each day:



me = time.resample('D').mean()


The calculated mean is wrong:



dVal         eVal
2015-01-01 4.014973 --> The correct mean of the first day is 3.5
2015-01-02 4.006548
2015-01-03 4.010406
2015-01-04 4.034531
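
For reference, below is a minimal, self-contained sketch of the same workflow on a couple of made-up hourly rows (the numbers are hypothetical and only chosen so the expected daily means are obvious). If this prints 3.5 for both days while the real data still comes out around 4.0, the difference most likely lies in the data itself, for example eVal not being a clean float64, which time.dtypes will reveal.

import pandas as pd

# Hypothetical sample: two hourly readings per day, so each daily mean should be 3.5.
time = pd.DataFrame({
    'dVal': ['2015-01-01 00:00:00', '2015-01-01 12:00:00',
             '2015-01-02 00:00:00', '2015-01-02 12:00:00'],
    'eVal': [3.4, 3.6, 3.2, 3.8],
})

time['dVal'] = pd.to_datetime(time['dVal'])   # step 1: parse the timestamps
time.index = time['dVal']                     # step 2: use them as the index

print(time.dtypes)                            # eVal should show as float64
print(time['eVal'].resample('D').mean())      # step 3: daily mean -> 3.5 for both days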









python-3.x time pandas-groupby






asked Nov 21 at 9:04









jjgasse

  • works fine for me: eVal indexed by dVal → 2015-01-01: 3.506160, 2015-01-02: 3.493167
    – RomanPerekhrest
    Nov 21 at 9:52












  • Did you follow my steps? Is your eVal column a float like mine? If you try the min() or max() functions, do they give you the correct result? Because I have a problem with those two functions as well.
    – jjgasse
    Nov 21 at 9:57












  • dVal datetime64[ns] eVal float64 dtype: object
    – RomanPerekhrest
    Nov 21 at 10:00










  • I tried different functions but still don't get the correct result. I tried time.set_index('dVal').groupby(pd.TimeGrouper('D')).mean().dropna(), but it gives me the same result as above. My dtypes are the same as yours (see the first sketch after these comments).
    – jjgasse
    Nov 22 at 9:16












  • If I use only me = time.resample('D') and then call me.describe() to get an overall picture, all of the values (max, min, mean) are wrong as well (see the second sketch after these comments).
    – jjgasse
    Nov 27 at 9:49
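
Regarding the pd.TimeGrouper variant mentioned in the comments: pd.TimeGrouper has been deprecated in newer pandas releases in favour of pd.Grouper, so an equivalent grouping (a sketch, assuming dVal is still a plain column as in that comment) would be:

# Same daily grouping as groupby(pd.TimeGrouper('D')), using the non-deprecated pd.Grouper.
daily = time.set_index('dVal').groupby(pd.Grouper(freq='D'))['eVal'].mean().dropna()

On the same data this should match resample('D').mean() exactly.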
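
And for the me.describe() check in the last comment: the resampler can also compute several statistics at once, which makes it easier to see whether min, max and mean are all off together. A small sketch under the same assumptions as above:

# Per-day min / max / mean in one table (assumes the DatetimeIndex set earlier).
daily_stats = time['eVal'].resample('D').agg(['min', 'max', 'mean'])
print(daily_stats)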



















