Why are there no octal literals in C#?
Why didn't the designers of C# include octal literals alongside hexadecimal and binary literals? What would be the reason for that decision?
c#
asked Nov 24 '18 at 13:08 by Марк Павлович
edited Nov 28 '18 at 18:41 by Stephen Kennedy
I strongly suspect it's because they're rarely useful, but they can be extremely annoying. I've certainly always found it strange that in Java, int x = 010; is not the same as int x = 10;. I wouldn't mind with an explicit specifier, but just making a leading zero significant is horrible IMO.
– Jon Skeet
Nov 24 '18 at 13:14
@HansPassant it's not about dominance; octal is still a very useful base for triplets. This can also be seen in chmod.
– Minn
Nov 24 '18 at 13:21
Yeah, but octal literals could also be denoted by 0o; wouldn't that be even more logical than just 0?
– Марк Павлович
Nov 24 '18 at 13:25
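Since C# ended up with no octal literal at all, chmod-style triplets like the ones mentioned in these comments have to round-trip through strings or binary; a minimal C# sketch using Convert with base 8:

using System;

// C# has no octal literal, so chmod-style values go through Convert with base 8.
int mode = Convert.ToInt32("755", 8);          // 493 decimal, i.e. rwxr-xr-x
Console.WriteLine(Convert.ToString(mode, 8));  // "755", back to the octal spelling

// Binary literals with digit separators make the 3-bit triplets visible directly.
int sameMode = 0b111_101_101;                  // owner=rwx, group=r-x, other=r-x
Console.WriteLine(mode == sameMode);           // True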
2 Answers
Octal encoding is a relic of computing in the late 1950s and '60s. Back then it was quite useful: companies built machines with a 6-bit byte, commonly packed into an 18-bit or 36-bit word. Six bits was enough for everybody; teletypes and printers did not yet have lowercase letters, and English was the dominant language.
Octal is a good fit for such 6-bit bytes: it takes exactly two digits and every bit is used.
That inevitably petered out. The IBM-360 was very influential and it had an 8-bit byte; the PDP-11 of 1970 was important as an affordable 16-bit machine with an 8-bit byte. Around 1975 the older architectures acquired dinosaur status and programmers started heavily favoring hex: octal is clumsy for encoding 8-bit bytes, while hex gave the two digits back. The microprocessor kits of the era all used hex.
Octal still lasted a lot longer than it should have. DEC manuals always used octal notation, even for the 8-bitters. The C language brought it into the curly-brace languages: C started life on the GE-635 and the PDP-8, 6-bit machines, and it didn't become real C until the PDP-11, but the seed was planted, with a way to specify octal values that was far too easy: a leading 0 on the literal.
That produced countless bugs and extremely confused programmers. The C# team thoroughly removed such common bug generators from their curly-brace language. They did a terrific job.
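To make the digit arithmetic concrete, here is a small sketch (using the binary literals C# does have) of why octal fits a 6-bit byte perfectly but is clumsy for an 8-bit one:

using System;

// Each octal digit encodes exactly 3 bits, so a 6-bit byte is exactly 2 octal digits.
int sixBit = 0b101_011;                            // 43 decimal
Console.WriteLine(Convert.ToString(sixBit, 8));    // "53": two digits, no bits wasted

// 8 bits don't divide evenly by 3: an 8-bit byte needs 3 octal digits,
// and the top digit can only ever hold 2 bits (0..3).
int eightBit = 0b1111_1111;                        // 255
Console.WriteLine(Convert.ToString(eightBit, 8));  // "377"
Console.WriteLine(Convert.ToString(eightBit, 16)); // "ff": hex fits 4 bits per digit exactly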
answered Nov 24 '18 at 15:18 by Hans Passant (community wiki)
A lot of people use 0 as a prefix for visual alignment, so having octal literals is commonly seen as a footgun in programming languages:
var x = 11;
var y = 011;
In a language with C-style octal literals, y would silently be 9 rather than 11. This can happen by accident, whereas hex and binary literals require a special alphanumeric prefix, 0x or 0b, that cannot be typed unintentionally. Modern languages try to reduce these kinds of footguns.
answered Nov 24 '18 at 13:15 by Minn
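A minimal sketch of the contrast: because C# dropped octal literals entirely, the alignment habit above is harmless there, while the explicit prefixes it kept cannot be typed by accident.

using System;

int x = 11;
int y = 011;                 // in C or Java this would be octal 9; in C# the leading 0 is inert
Console.WriteLine(x == y);   // True

int h = 0x11;                // 17: hex needs an explicit 0x prefix
int b = 0b11;                //  3: binary needs an explicit 0b prefix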
Yeah, but it still could be done with a 0o prefix or something like that, and in your particular example one could use _ instead of 0, couldn't they?
– Марк Павлович
Nov 24 '18 at 13:19
It's an uncommon use case, so they probably decided to just not include it at all.
– Minn
Nov 24 '18 at 13:20
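For what it's worth, the _ digit separator can't replace the leading zero suggested above, since a C# numeric literal must start with a digit; a small sketch of what does and does not compile:

int a = 1_1;      // OK: 11, the separator goes between digits
// int b = _11;   // does NOT compile: _11 is parsed as an identifier, not a literal
int c = 0_11;     // OK: still decimal 11, the leading 0 has no octal meaning in C#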