Implement assembly version strategy #205
Comments
The "least bad" option sounds promising. IMHO the big question is whether or not whatever option is picked can be automatically enforced across the board (not just for Rx, but for any library), perhaps at the NuGet 'pack' level, so that if package authors try to pack incompatible versions for different platforms, NuGet will automatically issue an error, forcing you to explicitly opt into allowing this. It's an extremely easy mistake to make as a package author, so I think if we want to allow any sort of sane plug-in model in .NET, we really need to enforce this. |
In general, I wouldn't release newer versions (even major ones) that take away API surface unless it's been through a long enough obsolete phase. |
@bartdesmet I agree that we wouldn't release new versions that take away API surface area. The issue arises because we may have different (increasing supersets of) functionality available per netstandard version, based on available functionality/types.

Suppose we have the following types in netstandard1.0, assembly version 1.0.0:

```csharp
public static class SomeExtensions
{
    public static void MethodThatWorksEverywhere(...){};
}
```

And in netstandard1.3, assembly version 1.0.300:

```csharp
public static class SomeExtensions
{
    public static void MethodThatWorksEverywhere(...){};
    public static void MethodThatOnlyWorksOn13(...){};
}
```

Now, what do we do if we rev the major version to 2.x? netstandard1.0, assembly version 2.0.0:

```csharp
public static class SomeExtensions
{
    public static void MethodThatWorksEverywhere(...){};
}
```

And netstandard1.3, assembly version 2.0.300:

```csharp
public static class SomeExtensions
{
    public static void MethodThatWorksEverywhere(...){};
    public static void MethodThatOnlyWorksOn13(...){};
}
```

Make sense? |
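One hypothetical way to produce per-target assembly versions like the ones in this example is a conditional property per target framework in the project file. This is a sketch only, using today's SDK-style project format (which postdates this discussion) and made-up version numbers, not Rx's actual build setup:

```xml
<!-- Sketch only: hypothetical SDK-style project fragment, not Rx's real build.
     Gives each target framework its own AssemblyVersion band, so the
     netstandard1.3 build (a superset) always carries a higher version. -->
<PropertyGroup>
  <TargetFrameworks>netstandard1.0;netstandard1.3</TargetFrameworks>
</PropertyGroup>
<PropertyGroup Condition="'$(TargetFramework)' == 'netstandard1.0'">
  <AssemblyVersion>2.0.0.0</AssemblyVersion>
</PropertyGroup>
<PropertyGroup Condition="'$(TargetFramework)' == 'netstandard1.3'">
  <AssemblyVersion>2.0.300.0</AssemblyVersion>
</PropertyGroup>
```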
It seems to me that the version-number part of the 3-part key (name, version and signing key) is not the thing to target here. Depending on the number of things that differ by platform, can there be extra things in packages that support the extra features? E.g. like a … I am just spit-balling here. It seems it could be a fairly common issue. Do the NuGet/ASP/Core teams have any patterns for solving this stuff? |
@LeeCampbell The catch here is that it's not "by platform," it's by netstandard version. So that makes factoring very hard if we split things so there's a Rx.netstandard1, Rx.Netstandard13, especially when they're building on the same types. At the end of the day, this is really only an issue for a narrow set of cases where you have multiple libraries that have a dependency on Rx/Ix all loaded into another process without being able to specify NuGet dependencies. Then you have a greater potential for mixing versions of Rx that you don't control. If you're building an app/website/library, and you're using the NuGet packages / specifying NuGet dependencies, that'll always work right. |
Thanks for the clarification. Yes, it is something that I have faced before, and also solved (as a consumer) as you suggested, via separate App Domains or ILMerge. Both of which seem less invasive than forking the whole repo, rebuilding, signing, referencing and shipping (IMO). However, if there is a way to achieve this 'correctly' I would love to know. (i.e. is Rx the only lib facing this issue?) |
Another thought: can each platform build be signed by a different key? Taking the viewpoint that the version number is not the place to try to fix this, and, as @onovotny points out, the name is a cumbersome place to fix this. However, as it appears we have a known set of platform (groups), can each of these have a unique key, thus avoiding collisions for the same named and versioned assembly? e.g.

…

becomes

…

|
You wouldn't want to sign by a different key, as that'd break normal usage. Suppose you had a … That's why my proposal is the "least bad." The CoreFX team has the same issue and they have a big table keeping track of this; I'm essentially suggesting a similar thing.

Note that this is a problem other libraries may face. The question has to do with external surface area. If it's just the implementation that's different (like bait-and-switch PCLs before), then the versions can be the same. If the external surface area is different/increasing, then assembly versioning becomes important. Most NuGet packages that provide a net4 and a net45 version would likely have the same issue/broken-ness if their external surface area was different. This is only becoming a more widespread issue with the different levels of netstandard and the ease with which xproj makes it possible to target multiple framework versions.

My proposal here would fix the issue; the downside is that the assembly versions may not match the NuGet versions, and we'd still have the issue if we decide to rev the major version. Suppose we implement #199: many of the assemblies will go away. That would seem to be a case where revving the major is a no-brainer, as assembly binding redirects wouldn't work anyway. |
This all seems reasonable, especially if we're very sensitive and cautious regarding managing changes to the API surface... |
@onovotny Isn't this "problem" simply an indication that the public Rx API should really always be the same in all NuGet platform versions? If the different platform assemblies supported by the same NuGet package do not have the same API, I think what this means is that you should package them as distinct NuGet packages entirely. I don't think hacking the version system in this way is healthy at all and I would strongly advise against it. Are there really many cases like this in Rx right now? If there are only a couple of particular cases you may be able to work around it by throwing NotSupportedExceptions. Simply put, in my head NuGet packages are all about distributing APIs. If I download one version of a NuGet package, I expect to get the same API across the board for all platforms. All specific platform differences should be handled inside the internal implementation. |
@glopesdev Not having different surface area per platform is highly limiting, as it restricts us to the least common denominator. It's precisely what .NET Core and the .NET Standard aim to avoid. Having different packages doesn't help either, as packages aren't used by the runtime to load things; you'd still have the issue if one project used … The CoreFX packages do a similar thing to what we're doing, but even stricter. As mentioned in the summary, this is the "least bad" option that still enables things to work correctly. It only affects a narrow set of uses as well -- someone building an app won't hit this. |
I understand and agree with that reasoning regarding different platforms. But somehow handling different platforms still feels different from handling different versions of the same platform, but maybe that's just me and you can bundle everything in the same mechanism. I guess more importantly the question is around what should be specified explicitly by the developer versus what can be resolved implicitly by the runtime loader and I think these two scenarios should be kept consistent with each other somehow. For example, I'm assuming the runtime loader doesn't do binding redirects across platforms (e.g. from .NET core to .NET standard). However, if you do want to treat different versions of the same platform as different platforms themselves, then by the same reasoning the runtime loader should never do binding redirects between them either. My point is that something feels inconsistent here. What I have to specify explicitly as a developer should match with what is assumed implicitly by the runtime. Does this make sense? |
I'm implementing this now, but it'll be extra painful until this is fixed: … Doesn't seem we can keep the … Update: there's still a bug in their logic, but there's a workaround: one version file per solution (Ix/Rx is possible). |
I found this comment illuminating:
It seems that the underlying problem is that the identity of an assembly is a different concept between NuGet and the runtime loader. For the runtime, the identity is Name+Version+Key, while NuGet only considers Name+Version. A solution that "works" for NuGet will potentially break the runtime, and vice-versa (as exemplified by the "sign platform APIs with different keys" solution and the above comment). I'm not sure I agree that the .NET plugin model as a whole is broken; I think that the fact that NuGet and the runtime use different assembly/package identity models is what's breaking all of this in the first place. I'm not sure there is any way to reconcile these two models, tbh. This banding approach feels like a reasonable compromise. 👍 |
After a long discussion with @ericstj about #97, it seems like there are no really good options here, just one possibly "least bad" one.
To summarize the issue: when a package provides multiple assemblies that work on a platform, the AssemblyVersion for each "higher/wider" version needs to be greater to ensure binding redirects work as expected. This means we cannot simply use the NuGet package version for all assemblies in the package. This is true whether it's `net40` and `net45`, or `netstandard1.0`, `netstandard1.1` and `netstandard1.3` as we have today. The core issue is the same -- higher versions have a superset of the functionality of lower-platform versions. In assembly versioning, it's generally considered the case that a higher assembly version contains all of the functionality of a lower one. This leads to the following "least bad" option:
- `netstandard1.0`: `3.0.0` - `3.0.999`
- `netstandard1.1`: `3.0.1000` - `3.0.1999`
- `netstandard1.3`: `3.0.3000` - `3.0.3999`
- `net45`: `3.0.0` - `3.0.999`
- `net46`: `3.0.3000` - `3.0.3999`
- `win8`: `3.0.1000` - `3.0.1999`
- `wp8`: `3.0.0` - `3.0.999`
- `win81`: `3.0.2000` - `3.0.2999`
- `wpa81`: `3.0.2000` - `3.0.2999`
- `uap10.0`: `3.0.4000` - `3.0.4999`
- `netcoreapp1.0`: `3.0.6000` - `3.0.6999`
Given this model, we would increment `Minor` and `Patch` versions on the NuGet package as appropriate for the release, but each release would increment a value in the given range for the assembly. This will ensure that something isn't accidentally redirected to a higher assembly version with fewer/missing API surface area.

One open question is `Major` versions. Given that major versions are generally considered able to have breaking changes, it may be fair to bump the major assembly version to match. This could potentially lead to an issue if a `3.0.3000` assembly gets redirected to a `4.0.0` version that has less surface area. For a major release, this might be acceptable.

Thoughts?
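To illustrate why the banding matters for binding redirects, here is a minimal, hypothetical app.config fragment (the assembly name, key token, and version numbers are placeholders, not real Rx values). Redirects always unify references onto one version, so the bands keep a `netstandard1.3`-band reference from ever being unified onto a `netstandard1.0`-band build with less surface area:

```xml
<!-- Hypothetical app.config fragment; name, publicKeyToken and versions
     are placeholders chosen to match the banding scheme above. -->
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <assemblyIdentity name="System.Reactive.Core"
                          publicKeyToken="0000000000000000"
                          culture="neutral" />
        <!-- Unify older netstandard1.3-band references onto the current
             netstandard1.3-band build. 3.0.3000 -> 3.0.3050 is safe because
             the higher version within a band is a superset of the lower. -->
        <bindingRedirect oldVersion="3.0.3000.0-3.0.3049.0"
                         newVersion="3.0.3050.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>
```

Because each band occupies a disjoint revision range, a redirect written against one band never captures an assembly from another band, which is exactly the accidental-downgrade scenario the scheme is designed to prevent.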
/cc @bartdesmet @paulcbetts @mattpodwysocki @shiftkey @LeeCampbell @shana @forki @OmerRaviv @natemcmaster