feat: spike enabling test diagnostics #339

Closed · wants to merge 10 commits

Conversation

@codonnell commented Nov 15, 2023

Compiles tests and returns their compiler diagnostics when `mix_env = "test"` is included in the next-ls init options. It should have no effect with the default `"dev"` mix_env.
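For context, a minimal sketch of how the server could interpret the new init option (next-ls's actual option-parsing code may differ; `init_opts` here stands in for the parsed initializationOptions map):

```elixir
# Sketch only: map the "mix_env" init option (a string from the client)
# to the atom the runtime compiles under, defaulting to :dev.
mix_env =
  case init_opts["mix_env"] do
    "test" -> :test
    _ -> :dev
  end
```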

TODO

  • Get initial implementation working
  • Write integration test
  • Ensure we can get test diagnostics from a set of well-known Elixir repos
    • next-ls
    • phoenix
    • ecto
    • credo
    • oban
  • Ensure we still get compiler diagnostics from lib when test helpers fail to load

To get this working initially, this hard-codes compilation to run with `mix_env == :test`.
@mhanberg (Collaborator)

> Do we need to handle files that aren't in the context of a mix project?

Ideally this would be nice, so that we can use next-ls for "single files" like scripts.

> Do we need to pull compilation or mix test arguments from somewhere?

I'm not sure I understand the purpose of having those options. We aren't actually running the tests, just compiling those files.

@mhanberg (Collaborator)

> Should we run parallel runtimes, one for test and one for dev? Or just compile with mix_env = :test?

It doesn't seem like you are actually starting the runtime with MIX_ENV=test (unless you are doing that locally with the mix_env init option).

I think it would be fine to send two compilation requests on save: one for the project as normal and one for the tests. (Potentially, if you are saving a test file, we only trigger a compilation for the tests, since the project code wouldn't have changed; but that compile would be a no-op anyway, so it might be quick enough.)

Starting two runtimes, I think, might be a no-go.
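A rough sketch of what that on-save flow could look like (the module and function names here are hypothetical, not next-ls's actual API):

```elixir
# Hypothetical on-save handler: always compile the project, and
# additionally compile the tests when test diagnostics are enabled.
def handle_did_save(uri, state) do
  Runtime.compile(state.runtime)

  if test_diagnostics_enabled?(state) do
    # a no-op for unchanged code, so hopefully cheap
    Runtime.compile_tests(state.runtime)
  end

  state
end
```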

@mhanberg (Collaborator)

> Work out what to do with test warnings
> Figure out if we should warn on *_test.ex test files

I am a little confused about what these questions mean; we would want diagnostics to get published to the editor.

@codonnell (Author)

> Ideally this would be nice, so that we can use next-ls for "single files" like scripts.

I can look into what this might look like; maybe just try running Kernel.ParallelCompiler.require/2 on the file?
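A minimal sketch of that idea (not part of this PR; on Elixir versions before 1.15 the warning and error elements are `{file, position, message}` tuples, while newer versions return richer diagnostics):

```elixir
# Compile/load a standalone script and collect anything diagnostic-like.
case Kernel.ParallelCompiler.require(["path/to/script.exs"]) do
  {:ok, _modules, warnings} ->
    warnings

  {:error, errors, warnings} ->
    errors ++ warnings
end
```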

> I'm not sure I understand the purpose of having those options. We aren't actually running the tests, just compiling those files.

This was more of a note to self: the mix test code I was pulling from has some options, and I wasn't sure whether some of them might be important for loading the code in certain projects. Sorry, I jotted these notes down quickly and should have added a little more context.

@codonnell (Author)

>> Should we run parallel runtimes, one for test and one for dev? Or just compile with mix_env = :test?
>
> It doesn't seem like you are actually starting the runtime with MIX_ENV=test (unless you are doing that locally with the mix_env init option).
>
> I think it would be fine to send two compilation requests on save: one for the project as normal and one for the tests. (Potentially, if you are saving a test file, we only trigger a compilation for the tests, since the project code wouldn't have changed; but that compile would be a no-op anyway, so it might be quick enough.)
>
> Starting two runtimes, I think, might be a no-go.

The difficulty I see with having only one runtime (with mix_env = :dev) is that test-only dependencies can make the test code impossible to load in the dev environment. I'm relatively new to Elixir, so I'm unsure how common that is in practice.
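To illustrate the failure mode with a typical mix.exs (the version constraint is only for illustration; a later comment in this thread notes that bypass is exactly such a dependency in the next-ls repo):

```elixir
# A dependency declared only for the :test env is absent entirely when the
# runtime compiles under :dev, so test modules that reference it can't load.
defp deps do
  [
    {:bypass, "~> 2.1", only: :test}
  ]
end
```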

@codonnell (Author)

>> Work out what to do with test warnings
>> Figure out if we should warn on *_test.ex test files
>
> I am a little confused about what these questions mean; we would want diagnostics to get published to the editor.

This was also more of a note to self: Kernel.ParallelCompiler.require returns warnings separately from errors and compile results, and I wasn't sure whether they had the same format, so I left them out for the moment. I agree they should be included. 👍
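For illustration, normalizing a pre-1.15-style warning tuple into the same shape as an error before publishing might look like this (a sketch with an assumed internal diagnostic shape, not this PR's code):

```elixir
# Warnings arrive as {file, position, message}; errors carry the same
# fields, so both can be funneled into one diagnostic shape.
defp to_diagnostic({file, position, message}, severity) do
  %{file: file, position: position, message: message, severity: severity}
end
```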

I posted in Discord because I was excited to see an error in the editor. I do see that with this code only the initial compile appears to generate diagnostics; there is definitely more work to do to ensure this is robust and reliable.

@codonnell (Author)

The spike code is working well for me in the next-ls repo now, and I have a passing integration test. I'll continue to verify as I work through the remaining TODOs.

@codonnell (Author)

Looking more into the dev vs. test profile question, it seems that attempting to compile the tests with the dev profile is unlikely to work in many cases. I was previously testing locally with some code that set the runtime's mix env to :test; when I remove that, compilation fails in the next-ls repo because the bypass dependency isn't available. I added a commented-out line to this PR that you can toggle if you want to play with this yourself.

I'm not familiar enough with the Elixir ecosystem to be really confident in this suggestion, but my feeling is that most dev dependencies are for mix tasks (documentation generation, linting, etc.) or developer tools used from iex, not part of the actual compiled codebase. It's possible that using test as the runtime's mix env would provide a better UX for most people. Given the strong divide between the dev and test compilation targets, I don't see an option other than picking one, which will naturally break when it encounters a dependency from the other profile, or eating the overhead of compiling with both targets.

Apologies for some poor communication on this; I'm finally on the other side of fighting through a nasty head cold, and it was harder than I would like to think things through clearly. Hopefully what I've written above makes sense.

@codonnell (Author)

And re: getting diagnostics on standalone exs files, that feels like a larger effort. I tried to test it and discovered that the vim plugin only starts a workspace when it finds a mix file: https://github.com/elixir-tools/elixir-tools.nvim/blob/8f573b0b089567aa9c544b029000c97e715e5f58/lua/elixir/nextls/init.lua#L76. The runtime also depends heavily on mix commands. It would be a nice feature, but perhaps it can be delivered separately from surfacing test diagnostics in mix projects.

@codonnell marked this pull request as ready for review on November 22, 2023.
@codonnell (Author)

I cleaned up the code and went with the implementation that I think makes sense, where tests are only compiled if the user sets mix_env = "test" in their next-ls init options.

I tested this against the main branches of next-ls, ecto, phoenix, credo, and oban. I'm open to suggestions if you have thoughts on other repos that might be good to check. It works for next-ls, phoenix, and credo, but not for ecto or oban; in particular, the test helper files fail to load successfully in both. I'm going to do a bit of refactoring to ensure that non-test diagnostics are still reported when test helpers fail to load, and see if I can get to the bottom of what's going wrong in ecto and oban.

@codonnell (Author) commented Nov 22, 2023

Alright, I believe I understand what's going on with ecto and oban. In both cases, the test_helper.exs file is not just loading test helper code, but also starting ancillary applications. Ecto is starting a test repo here, and oban is starting a few test repos here. These extra applications rely on the fact that the mix application has already been started.

Because there is no separation between requiring test helper code and starting test applications, I believe we'll need to start the mix application, like mix test does here. Does that make sense to you? Do you see any alternative?
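Concretely, what "start the mix application" could look like in the runtime, wrapped so that a failing app start doesn't swallow lib diagnostics (a sketch, not this PR's code):

```elixir
require Logger

# mix test runs the app.start task before loading test_helper.exs; do the
# same, but tolerate failure so lib compilation diagnostics still surface.
try do
  Mix.Task.run("app.start", [])
rescue
  error ->
    Logger.warning("app.start failed: " <> Exception.message(error))
end
```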

@mhanberg (Collaborator)

Why do you need to compile the test helper file?

@codonnell (Author)

> Why do you need to compile the test helper file?

Because it can load modules that the tests depend on. For example, ecto tests rely on the test repo module loaded here.
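The pattern at issue looks roughly like this (generic, illustrative names; ecto's actual test_helper.exs is more involved): helper modules are loaded and ancillary processes are started in the same file, so the helpers can't be required without the application machinery they lean on.

```elixir
# test/test_helper.exs: loading helper code and starting processes are
# interleaved, with no seam for a language server to hook into.
Code.require_file("support/test_repo.exs", __DIR__)
{:ok, _pid} = MyApp.TestRepo.start_link()
ExUnit.start()
```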

@mhanberg (Collaborator)

I think this might take some more hacking/thinking, so let's put a pause on this for now.

I really appreciate the effort! We've learned a lot.

@codonnell (Author) commented Nov 30, 2023

No worries, I understand. This is more difficult and a bigger change than initially expected. I've pushed up what I have, which works reliably and quickly on all the repos above except ecto. That way, if someone wants to pick this up in the future, the work is here.

Documenting the remaining known bug: in ecto, there are some spurious warnings that seem related to a difference between require and compile that I don't understand. If I use Kernel.ParallelCompiler.require instead of Kernel.ParallelCompiler.compile to get test diagnostics, the warnings disappear. But Kernel.ParallelCompiler.require will not reload a file after it has been loaded once. As far as I can tell, the difference between the two comes down to which of these two functions gets called. Not reloading a file that's already been loaded makes sense; maybe the difference comes down to the error handler? I just haven't tracked it down yet.
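For reference, the two calls being compared take the same arguments but differ in reload behavior (this contrast restates the observation above, not a diagnosis):

```elixir
# compile/2 recompiles the files every time, which suits re-running on
# save, but in ecto it produces the spurious warnings described above.
Kernel.ParallelCompiler.compile(test_files)

# require/2 skips files that have already been loaded, so the warnings
# disappear, but later edits to a loaded file are never picked up.
Kernel.ParallelCompiler.require(test_files)
```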

It might also make sense to gate running app.start behind an option, preferably at the project level. Starting an application could end up binding a port or doing something else that causes problems, and most projects probably don't need it to compile their tests. Of the five I tried, only oban and ecto did.
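A sketch of that gate (the option name `start_app` is invented for illustration):

```elixir
# Only start the application when the project opts in; most projects
# compile their tests fine without it.
if Keyword.get(opts, :start_app, false) do
  Mix.Task.run("app.start", [])
end
```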

@codonnell closed this on Nov 30, 2023.