java.lang.Long cannot be cast to java.lang.Double ERROR when using MAX()
Since the Cloud Dataprep update yesterday (19/11/2018), I get an error every time I use the MAX() function, either alone or in a pivot.
Some notes:
- I used the MAX() function on another dataset and it worked, so MAX() itself works.
- I didn't have this issue before yesterday's Dataprep update; the flow was working.
- I tried editing the recipe many times to isolate the issue, but it seems to come down to the MAX() function.
- The columns I'm using MAX() on are of type INT. I tried converting INT -> FLOAT -> INT to make sure they are INT before using MAX(), but I keep getting the same issue.
Here is the log:
java.lang.ClassCastException: java.lang.Long cannot be cast to java.lang.Double
at com.trifacta.google.dataflow.functions.MaxCombineFn.binaryOperation(MaxCombineFn.java:18)
at com.trifacta.google.dataflow.functions.BinaryOperationCombineFn.addInput(BinaryOperationCombineFn.java:60)
at org.apache.beam.sdk.transforms.CombineFns$ComposedCombineFn.addInput(CombineFns.java:295)
at org.apache.beam.sdk.transforms.CombineFns$ComposedCombineFn.addInput(CombineFns.java:212)
at org.apache.beam.runners.core.GlobalCombineFnRunners$CombineFnRunner.addInput(GlobalCombineFnRunners.java:109)
at com.google.cloud.dataflow.worker.PartialGroupByKeyParDoFns$ValueCombiner.add(PartialGroupByKeyParDoFns.java:163)
at com.google.cloud.dataflow.worker.PartialGroupByKeyParDoFns$ValueCombiner.add(PartialGroupByKeyParDoFns.java:141)
at com.google.cloud.dataflow.worker.util.common.worker.GroupingTables$CombiningGroupingTable$1.add(GroupingTables.java:385)
at com.google.cloud.dataflow.worker.util.common.worker.GroupingTables$GroupingTableBase.put(GroupingTables.java:230)
at com.google.cloud.dataflow.worker.util.common.worker.GroupingTables$GroupingTableBase.put(GroupingTables.java:210)
at com.google.cloud.dataflow.worker.util.common.worker.SimplePartialGroupByKeyParDoFn.processElement(SimplePartialGroupByKeyParDoFn.java:35)
at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:43)
at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:48)
at com.google.cloud.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:271)
at org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:309)
at org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:77)
at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:621)
at org.apache.beam.sdk.transforms.DoFnOutputReceivers$WindowedContextOutputReceiver.output(DoFnOutputReceivers.java:71)
at org.apache.beam.sdk.transforms.MapElements$1.processElement(MapElements.java:128)
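For context, this exception reflects plain Java semantics: a boxed java.lang.Long can never be cast directly to java.lang.Double, even though the numeric value would fit. The trace suggests the max-combine step unboxes each input as a Double while the INT column's values arrive as Longs. Below is a minimal, runnable sketch of that failure mode; the names (MaxCastSketch, addInputUnsafe, addInputSafe) are hypothetical illustrations, not Trifacta's actual MaxCombineFn internals.

public class MaxCastSketch {
    // Mimics a max-combine step that assumes every boxed input is a Double.
    static double addInputUnsafe(double maxSoFar, Object input) {
        // Throws ClassCastException when 'input' is a java.lang.Long:
        // boxed numeric types cannot be cast across, only converted.
        return Math.max(maxSoFar, (Double) input);
    }

    // Safe variant: widen through the Number supertype, which both
    // Long and Double extend, then convert explicitly.
    static double addInputSafe(double maxSoFar, Object input) {
        return Math.max(maxSoFar, ((Number) input).doubleValue());
    }

    public static void main(String[] args) {
        Object boxedLong = 42L; // how an INT column value may arrive

        System.out.println(addInputSafe(0.0, boxedLong));   // prints 42.0
        System.out.println(addInputUnsafe(0.0, boxedLong)); // ClassCastException
    }
}

This would also be consistent with the INT -> FLOAT -> INT conversion attempt not helping: the type shown in the UI changes, but the values presumably still reach the combine function boxed as Longs.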
google-cloud-dataprep
asked Nov 20 at 9:23 by Fontain
edited Nov 21 at 3:54 by Iñigo
I edited my answer so you know that the issue is now fixed. – Iñigo, Nov 21 at 8:14
1 Answer
I'm with Google Cloud Platform Support.
This is an internal issue that appeared after the update on the 19th, as you said. We know about it and are working with the Trifacta team (Dataprep is a third-party product developed and managed by them).
There is a Public Issue tracking this; feel free to add any information you think is needed.
EDIT: The issue is fixed now. Could you try again and tell me if it worked?
answered Nov 20 at 15:58 by Iñigo
edited Nov 21 at 8:13