Thinking about default "not nil" #6638
That's my fear, yes. Also, with the release of the "Nim in Action" book I'd like to keep version 1 compatible with the language described in the book. It's obvious that Nim version 2 will come sooner rather than later, with breaking changes (I learned too many things during Nim's design ;-) ) and a new runtime implementation, so I thought we can do the "not nil" thing properly in the later versions. Your code is a good idea but seems too much of a workaround to embrace it officially.
@Araq If there is an easy way to introduce default "not nil" without breaking changes, why not?
There are 3 benefits:
Some languages add '?' to the type to signify that it can accept null. It could be that treating nil so specially complicates the language too much.
I realize that this is an old issue, but I have been looking into Nim recently and this issue is really one of the only problems I see with the language currently from my perspective. I understand that it may be difficult and obtrusive to make a change like this, but I do feel that it's something that should be considered!
FYI as a source of inspiration: C# 8 is introducing non-nullable reference types after many years of not having that feature. To avoid breaking everything, you have to use a compiler flag to turn it on, and it can be disabled in regions of code as well. I have no idea how hard this would be to do for Nim, but it could allow “default not nil” types to be introduced safely and gradually without needing a Nim 2.0.
I don't think this feature has the relevance it once had. @kvinwang, can this issue be closed? It really is outdated in its current form.
That's not possible if you need runtime polymorphism. Also in my experience that's not what most existing Nim code does.
I wouldn't close it: it's a valid issue, and the non-ref workaround is good only for some cases/situations. The fact is that Nim code in the wild is full of ref/recursive types (just look at case objects), and people will keep using them all the time if the language provides them.
btw @jkaye2012 @bluenote10 I worked on an alternative notnil/ref nilcheck branch in winter: https://github.com/alehander42/Nim/tree/ref-nilcheck. It was almost mergeable, and it seemed to work, but it still had some gotchas (the plan was to use a pragma to limit the checking to sections of the codebase). So I am just adding this here to document an additional approach to the problem: not sure if it still makes sense given the new ref/runtime ideas, but if somebody finds something valuable in it, it can be improved/reused.
Well, I never used
The non-ref version is not a workaround; it is the better default that works for most cases (all of the code bases that I started).
Well it is true that Nim code in the wild is full of ref types. I think that is mostly the fault of the bad documentation in the past that recommended ref types as a good default. I disagree on that, and I also updated the tutorial to not recommend ref types as a default anymore. It is still explained. It is also true that there are some cases of recursive types where a ref type is currently the best option, but would
I hope people will use them more carefully, only when they really need to, as
At the end of the day, if people have an option to do something, they're going to do it. So, if ref types are an option, and ref types are nil-able by default, you're going to end up with a lot of people using nil-able ref types all over the place no matter what documentation or best practices say. This is unfortunate, but I believe it's also a proven reality. I don't follow why anything has to be nil-able (you referenced recursive types above, but is that not an implementation detail that could be changed?). The conversation should really focus more on what value nil is providing (or, conversely, what problems it is causing). I also think it makes sense that it's not a problem you personally have. As a contributor and very experienced Nim developer, I would posit you're actually much less likely to be bitten by the kind of issue that nil causes than the average programmer picking Nim up. IMO, the idea should be to make it as easy as possible for things to be correct in the average case rather than optimizing for those who really know what they're doing.
What about

```nim
# bad example but still...
import macros, ast_pattern_matching

static:
  var n: NimNode   # never assigned, so it is nil here
  matchAst(n)
```
Well, when the documentation explicitly states the alternatives to ref types and the disadvantages of
Things have to be nilable, simply because you have to construct the values somehow. And what value do ref types have when they are not yet assigned to?
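To illustrate the point being made here, a minimal sketch (with a hypothetical `Person` type, not from the thread): a ref variable exists before any object has been constructed for it, and nil is the value it holds in between.

```nim
type Person = ref object
  name: string

var p: Person            # declared, but no object constructed yet
echo p.isNil             # true: a ref is nil until something is assigned
p = Person(name: "Ada")  # construction is what makes it non-nil
echo p.name              # Ada
```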
Well, what is the average programmer? Someone who learned Python before? Someone who learned Java before? Someone who learned JavaScript before? Someone who learned C++ before? People usually complain that they can't apply their way of thinking from the programming language they come from. People who come from languages where everything is a ref apply this way of programming to Nim. And then they want the solutions from those programming languages in Nim as well, even though Nim might already have better solutions.
I did not introduce the type
Sure there is. Make it more difficult to do the thing that shouldn't be done than the thing that should be done. The language defaults should enforce what you would want people to do; one should not have to rely on documentation for that.
A non-nilable type can never be unassigned - that's actually one of the major points of the entire exercise. I don't have enough experience/knowledge with the language to know what you mean by "you have to construct the values somehow". What aspect of that requires nil?
It sounds like you might agree that most developers are going to tend towards what they already know from other languages. If that's the case, wouldn't you want Nim to instead guide them towards the "better" way of doing things? Generally, I think that developers want to be able to get things done and that their default way of trying to do that is going to be applying what they already know (thus why they would reach for ref). If, instead, the default semantic were more aligned with how you would "want" it to work, it acts as a big red flag to that individual that they may be doing something in a not-so-great way.
Many things in Nim do enforce ref types:
Or how would you solve `proc operateOn(entities: seq[Entity])` without refs? I don't think it's fair to blame users for using ref as it is often a necessity. Personally I hate using refs because of the mutability aspect, and still I end up using them a lot because there is often no alternative.
@bluenote10 You don't need So all that is left are
In theory yes, but in practice that's not true. Even though you can define methods on non-ref types, it doesn't make much sense because you cannot upcast classes to their base type. For instance:

```nim
type
  Entity = object of RootObj
  A = object of Entity
    a: int
  B = object of Entity
    b: int

method doit(e: Entity) {.base.} =
  echo "base"

method doit(e: A) =
  echo "A"

method doit(e: B) =
  echo "B"

# yes we can define methods on them, but what's the point
# if we cannot treat an `A` as an `Entity`?
let e: Entity = A()
# crashes at runtime with:
# Error: unhandled exception: invalid object assignment [ObjectAssignmentError]
```

In my experience, dynamic dispatch is more or less useless without ref. Case objects violate the open-closed principle and are not an option in cases that require unbounded polymorphism.
Do you really require unbounded polymorphism? Or do you really want to use unbounded polymorphism no matter the costs?
The typical use case for me is combinators: things that you can put together to form a tree of things. This happens a lot, for instance while constructing:
All these things have in common that they can be built out of smaller similar things. This requires storing them in some heterogeneous sequence. It also requires that the user is able to define new building blocks, which requires unbounded polymorphism. And of course one needs dynamic dispatch to define recursive methods on these objects. This use case is so important that sometimes one has to bite the bullet and use refs (or closures, or some equivalent technique)
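As a concrete illustration of this combinator use case, a minimal sketch (hypothetical `Renderable`/`Text`/`Row` names, not from the thread); the heterogeneous `children` sequence and the recursive dynamic dispatch are what force the ref types:

```nim
type
  Renderable = ref object of RootObj
  Text = ref object of Renderable
    content: string
  Row = ref object of Renderable
    children: seq[Renderable]   # heterogeneous storage: needs ref

method render(r: Renderable): string {.base.} =
  ""

method render(t: Text): string =
  t.content

method render(r: Row): string =
  for c in r.children:
    result.add render(c)        # recursive dynamic dispatch

# users can add new building blocks without touching the base type
let ui = Row(children: @[Renderable(Text(content: "a")),
                         Renderable(Text(content: "b"))])
echo render(ui)                 # prints "ab"
```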
Nothing changed for
Can this be closed? Only ptr/ref/pointer objects can be [1] Note that the only exception is
@timotheecour, it's not strictly necessary to use object variants to represent a
You can do a similar trick for range types, and perhaps through a general mechanism that's able to generate an "invalid value" of a particular type through some deterministic algorithm.
In the future, such an optimisation can become important for types such as
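The ref-type case of this trick can be sketched as follows (a hypothetical `NicheOption`, not the stdlib's `Option`): since nil is never a valid "some" value for a ref, nil itself can serve as the "none" sentinel, so no separate flag field is needed:

```nim
type
  NicheOption[T: ref] = object
    val: T                       # nil encodes "none"

proc some[T: ref](x: T): NicheOption[T] =
  assert not x.isNil
  NicheOption[T](val: x)

proc isSome[T](o: NicheOption[T]): bool =
  not o.val.isNil

proc get[T](o: NicheOption[T]): T =
  assert o.isSome
  o.val

type Node = ref object
  name: string

let a = some(Node(name: "x"))
var b: NicheOption[Node]         # default-initialized: none
echo a.isSome, " ", b.isSome     # true false
echo sizeof(NicheOption[Node]) == sizeof(Node)  # true: no flag overhead
```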
I don't see how you'd be able to express this efficiently without

```nim
type Node[T] = ref object # a linked list
  next: Node[T]
  payload: T # eg: uint8
```
I'm leaving this issue open. I'm leaning towards a secondary "type system" via DrNim.
@timotheecour, this example would be just

```nim
type Node[T] = ref object # a linked list
  next: Option[Node[T]]
  payload: T # eg: uint8
```

My point was that

```nim
let path = getOptionalFilePath()
if path.isSome:
  var f = openFile(path) # path.get (or path[]) is called behind the scenes here
```

If the compiler treats
wouldn't it be more like
automatic dereferencing for

and here's an example with your suggested automatic dereferencing for Option[T]:

```nim
import std/options

proc fun(opt: Option) =
  assert opt.isSome
  echo opt.type.T

type Foo[T] = object
  x: int

fun(some(Foo[int](x: 1)))
```

this would print

(I'm not talking about automatic dereferencing for accessing fields of ref objects, that's a much saner language feature and many languages do this too)
@timotheecour, the idea is not to change the type of the symbol after the assert, but just to allow dereferencing for the purposes of signature matching (and potentially to allow that only when the
With that said, I'm still a bit confused about your example. Why should the
FWIW, Rust has user-extensible automatic dereferencing and my understanding is that it helps them a lot in the jungle of pointer types there: https://stackoverflow.com/questions/28519997/what-are-rusts-exact-auto-dereferencing-rules
With ARC coming along, we are headed towards a similar jungle.
+1 for non nil
I noticed that the "default 'not nil'" is no longer in the 1.0 battle plan, but this feature is really the most important one I am looking forward to, because I think it will improve code quality across the entire Nim ecosystem.
Is it because this feature is hard to implement, or because it would introduce too many breaking changes?
We could introduce new types (e.g. String) that are not nil by default in the stdlib, encourage developers to use the new types, and deprecate the old ones.
My practice:
Currently, default not nil is already used throughout my app, and it feels good because I don't have to pay much attention to handling nil values here and there.
The way I am doing this:
and define some generic prove procs:
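The code snippets for this comment were not preserved above. Purely as an illustration (not the author's actual code), a sketch of such a "prove proc" approach using Nim's experimental notnil checking, with hypothetical names, might look like:

```nim
{.experimental: "notnil".}

type
  Conn = ref object
    host: string

# a generic "prove" proc: turn a possibly-nil ref into a `not nil` one,
# failing loudly at the boundary instead of deep inside the app
proc prove[T: ref](x: T): T not nil =
  if x.isNil:
    raise newException(ValueError, "unexpected nil")
  return x

proc connect(host: string): Conn not nil =
  prove(Conn(host: host))

let c = connect("example.org")
echo c.host   # the compiler knows c can never be nil here
```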