Making new language features #{ Stand Out }
There is a phenomenon I've noticed in the way community members react to proposed new language features. The new feature seems odd, and different, and contrary to the way the language was before. So they want the syntax of the feature to be odd, and different, and clearly distinguish the new feature from all of the old features of the language.
Succumbing to that request is a bad idea. When you do, (some) people are at first happy that they can so easily distinguish this new feature from the rest of the language. But users soon grow tired of the extra syntax surrounding the new feature, and request, as yet another new feature, the ability to do the same thing without all the boilerplate.
It is better to design the feature so that it fits well with the existing features of the language, even if it might at first seem jarring that things have changed. Users will quickly get over the newness of the changes and learn to understand the language as a whole as it is (after the change).
6 comments:
Sounds like a corollary to Stroustrup's Rule[1].
[1]: http://lambda-the-ultimate.org/node/5402
The #{ Stand Out } features introduce permanent sideshows in the language with their own ghettoized programming patterns. I'm thinking of languages which can embed SQL or XML (hi Scala!), but many less severe examples are out there. Format strings and regex notations are an edge case: do we need them or not? We do, but they should be built on top of library support for concise notations, rather than as an embedded concise notation.

A language is powerful when the contribution C of each of N features combines cleanly with most other features; then your expressiveness is C**N. A language which adds N sideshows gets only C*N expressiveness, since each ghetto adds nothing to the other ghettos.

Stroustrup's rule (nice reference!) says folks are threatened by C**N and demand C*N until they want that fruitful interoperation at C**N. A new feature should be taught as if it were in a ghetto.
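The commenter's C**N versus C*N arithmetic can be made concrete. A minimal sketch in Java, with C = 2 and N = 10 chosen purely as illustrative numbers (they appear nowhere in the original comment):

```java
// Toy illustration of the commenter's informal model: C is the
// "contribution" of each feature, N is the number of features.
public class Expressiveness {
    public static void main(String[] args) {
        int c = 2;   // per-feature contribution (hypothetical value)
        int n = 10;  // number of features (hypothetical value)

        // Features that combine cleanly multiply: C**N combinations.
        long composable = (long) Math.pow(c, n);

        // Ghettoized features only add up: C*N.
        long siloed = (long) c * n;

        System.out.println(composable); // 1024
        System.out.println(siloed);     // 20
    }
}
```

With these toy numbers, ten cleanly composing features yield 1024 expressible combinations while ten siloed features yield only 20 — which is the commenter's point about sideshows contributing nothing to each other.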
Just wondering, what examples have you got in mind for this phenomenon?
@Peter The latest example I ran into was selecting a syntax for a "discard" in pattern-matching in C#. The traditional syntax for that in functional languages is _. Some people argued that was too subtle and hard to see in source. But a discard is by design "low profile", and there is no profile lower than the underbar.
Another example is the use of the "default" keyword in Java's default interface methods. It adds nothing but noise to the syntax.
Actually I believe the biggest complaint regarding discard was the fact that the underbar is already a legal identifier in C#, and that there is overlap as to when it can be used as an identifier vs. when it can be used as a discard:
public class Foo
{
    private int _;

    public bool IsNumber(string input)
    {
        // "_" here resolves to the field, not a discard:
        // the parse result is silently written into it.
        return double.TryParse(input, out _); // oops!
    }
}
Otherwise adopting the underbar for discard was a great idea, but this felt more like a desire to copy the syntax from other languages rather than adapting it to fit and feel like C#.
Personally I would've much preferred it if C# did use underbar as a discard and deprecated its use as an identifier. That is the route Java has taken: underbar is no longer a legal identifier as of Java 9.