Why does GPU performance change so much for different data types?
Using a small piece of code to compare CPU and GPU performance, I found the difference between data types very interesting:
float16 has the worst performance overall, but shows the biggest speed-up on the GPU.
float32 has the best performance, but the GPU speed-up is limited.
float64 is faster than float16, but with float64 the GPU is slower than the CPU.
Can anyone share some insight into why this happens?
My hardware is a Xeon E5-1650 + Quadro K620.
This is my testing code:
from __future__ import print_function
import matplotlib
import matplotlib.pyplot as plt
import tensorflow as tf
import time

def get_times(maximum_time):
    # One list of timings per device.
    device_times = {
        "/gpu:0": [],
        "/cpu:0": []
    }
    matrix_sizes = range(500, 50000, 50)

    for size in matrix_sizes:
        for device_name in device_times.keys():
            print("####### Calculating on the " + device_name + " #######")
            shape = (size, size)
            data_type = tf.float16  # change to tf.float32 / tf.float64 for the other tests
            with tf.device(device_name):
                r1 = tf.random_uniform(shape=shape, minval=0, maxval=1, dtype=data_type)
                r2 = tf.random_uniform(shape=shape, minval=0, maxval=1, dtype=data_type)
                dot_operation = tf.matmul(r2, r1)

            with tf.Session(config=tf.ConfigProto(log_device_placement=True)) as session:
                start_time = time.time()
                result = session.run(dot_operation)
                time_taken = time.time() - start_time
                print(result)
                device_times[device_name].append(time_taken)

            print(device_times)
            # Stop once either device becomes too slow.
            if time_taken > maximum_time:
                return device_times, matrix_sizes

device_times, matrix_sizes = get_times(1.5)
gpu_times = device_times["/gpu:0"]
cpu_times = device_times["/cpu:0"]

plt.plot(matrix_sizes[:len(gpu_times)], gpu_times, 'o-')
plt.plot(matrix_sizes[:len(cpu_times)], cpu_times, 'o-')
plt.ylabel('Time')
plt.xlabel('Matrix size')
plt.show()
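For reference, the timed session.run above also executes the two random_uniform ops, so the random-number generation cost for each dtype is included in the measurement. To compare the three data types in one run, a minimal variant like the sketch below could be used (same TF 1.x API as above); it adds a warm-up session.run so the reported time excludes graph construction and kernel selection. The fixed size of 4000, the 3-run average, and the time_matmul helper are just assumptions for illustration, not part of my original test.

from __future__ import print_function
import time
import tensorflow as tf

def time_matmul(device_name, size, data_type, n_runs=3):
    # Build a size x size matmul on the given device and time only session.run.
    tf.reset_default_graph()
    with tf.device(device_name):
        r1 = tf.random_uniform((size, size), minval=0, maxval=1, dtype=data_type)
        r2 = tf.random_uniform((size, size), minval=0, maxval=1, dtype=data_type)
        dot = tf.matmul(r2, r1)
    with tf.Session(config=tf.ConfigProto(allow_soft_placement=True)) as session:
        session.run(dot)  # warm-up run: excludes one-off graph setup and kernel selection
        start = time.time()
        for _ in range(n_runs):
            session.run(dot)
        return (time.time() - start) / n_runs

size = 4000  # assumed test size; adjust to fit CPU/GPU memory
for dtype in (tf.float16, tf.float32, tf.float64):
    for device in ("/cpu:0", "/gpu:0"):
        print(dtype.name, device, "%.3f s" % time_matmul(device, size, dtype))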
asked Nov 21 at 10:11 by Capemer