Test renderer auto profiling #164476

Closed · 5 tasks done
jrieken opened this issue Oct 24, 2022 · 10 comments
Labels: testplan-item, verified (Verification succeeded)

Comments
@jrieken
Member

jrieken commented Oct 24, 2022

Refs: #163819

Complexity: 4



We have added and extended infrastructure to automatically profile the renderer process and to automatically extract heavy traces. Testing this requires freezing VS Code, something that shouldn't normally happen 🙃. To prepare, do the following:

  • enable this feature via: application.experimental.rendererProfiling
  • reveal the window log via "F1 > Show Logs > Window"
  • in the log look out for a message reading [info] [perf] Render performance baseline is NNms. This is the "perfBaseline" for your machine

To test that this works you need to put the renderer under pressure. Be creative and on the lookout for your own ways, but start with a good, easy, and reliable way of causing a stall:

  • download TypeScript's checker.ts file from here: https://raw.githubusercontent.com/microsoft/TypeScript/main/src/compiler/checker.ts
  • open the file in VS Code
  • in the window log, keep an eye open for messages like Renderer reported VERY LONG TASK (NNms), starting profiling session. (A very long task is defined as 10 * perfBaseline; see the detection sketch after this list.)
  • once the message appears the renderer is profiled for 5 seconds and the profile is then analysed
  • look out for error messages like [error] [PerfSampleError]|852ms by <<renderer>>. These are synthetic errors and their "stack trace" is the heaviest call path (of potentially many) into the heaviest function.
  • ensure these synthetic errors have no full paths in the "stack traces" (PII); a scripted check is sketched at the end of this comment
  • look out for a final message like [perf] stored profile to DISK 'file:///var/folders/8y/njkkd6l946z4y9xr1jy3hrnh0000gp/T/exthost-948822.cpuprofile'
  • open the profile inside VS Code, sort by "Self Time", and ensure the top entry matches the first synthetic error; expand the entry and ensure the heaviest call path resembles the "stack trace" of the error. Repeat this for the top 5 entries that are from our code.
  • enable trace logging and view the Telemetry Log
  • cause another freeze
  • make sure the unresponsive.sample events are sent
  • make sure no PII is sent (NO full file paths!)
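
For reference, here is a rough sketch of the long-task detection idea from the list above. This is not VS Code's actual implementation; the baseline value is a placeholder and the PerformanceObserver wiring is only meant to illustrate the 10 * perfBaseline threshold:

```ts
// Illustrative sketch only, not VS Code's implementation.
// Take the baseline from the "[perf] Render performance baseline is NNms" window log message.
const perfBaseline = 33; // ms, placeholder value

const observer = new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    // "Very long task": at least 10x the measured render performance baseline.
    if (entry.duration >= 10 * perfBaseline) {
      console.warn(`[perf] Renderer reported VERY LONG TASK (${Math.round(entry.duration)}ms)`);
      // ...this is where a ~5 second profiling session would be started and then analysed...
    }
  }
});

// Chromium emits 'longtask' entries for work that blocks the main thread for more than 50ms.
observer.observe({ entryTypes: ['longtask'] });
```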

Screenshot 2022-10-24 at 19 04 49
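
If you want to script the PII check from the list above instead of eyeballing the stored .cpuprofile, a hypothetical helper along these lines (the file name and structure are illustrative; it is not part of VS Code) can flag absolute paths in the profile's call frame URLs:

```ts
// check-profile-pii.ts: hypothetical helper, not part of VS Code.
import { readFileSync } from 'fs';

// V8 .cpuprofile files store call frames under nodes[].callFrame; the url field
// is where a full file path would show up.
const profile = JSON.parse(readFileSync(process.argv[2], 'utf8'));

const urls: string[] = (profile.nodes ?? []).map(
  (n: { callFrame?: { url?: string } }) => n.callFrame?.url ?? ''
);
const suspicious = urls.filter((url) => /^(\/|[A-Za-z]:\\|file:\/\/\/)/.test(url));

console.log(suspicious.length === 0 ? 'No full paths found.' : suspicious);
```

Run it against the path printed in the "stored profile to DISK" message, for example with ts-node.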

@roblourens
Member

I only get the message about the long task, and never any follow-up. And apparently my Linux VM in Azure is a bit slow. I know the CPU is not extremely fast, but 106ms is a bit surprising. Several reloads, never below 100ms. It doesn't seem like there is any other expensive process in the background.

2022-10-25 10:56:19.131 [info] [perf] Render performance baseline is 106ms
2022-10-25 10:58:40.260 [warning] [perf] Renderer reported VERY LONG TASK (1386ms), starting profiling session 'b8b0dc0e-2f1b-4839-9134-afff5d0aa0df'
2022-10-25 10:59:04.756 [warning] [perf] Renderer reported VERY LONG TASK (1318ms), starting profiling session 'a31471e9-53d8-4c16-8aa2-c313d08ae051'
2022-10-25 11:02:34.494 [warning] [perf] Renderer reported VERY LONG TASK (1502ms), starting profiling session '004ea2ef-be92-4b00-ac87-90c8d61f1d7a'

@roblourens
Member

It looks like the log won't be saved if we don't find anything "interesting" in it. I wonder whether my high baseline is preventing me from seeing the right kinds of long tasks. I've provoked a lot of slowness in VS Code but am still not seeing anything else.

2022-10-25 11:08:34.164 [info] [perf] Render performance baseline is 98ms
2022-10-25 11:09:35.508 [warning] [perf] Renderer reported VERY LONG TASK (1333ms), starting profiling session 'e5c7dfc0-25b2-4eb3-8a1c-94e39c298bbe'
2022-10-25 11:12:40.502 [warning] [perf] Renderer reported VERY LONG TASK (1229ms), starting profiling session '74c7bb14-d7cf-4ae3-b6ec-578082fbb96a'
2022-10-25 11:17:35.557 [warning] {}
2022-10-25 11:17:38.079 [info] UNRESPONSIVE extension host: received responsive event and cancelling profiling session
2022-10-25 11:17:47.139 [warning] [perf] Renderer reported VERY LONG TASK (1832ms), starting profiling session 'f47460ce-28aa-47f3-92ce-d16bcf940242'
2022-10-25 11:17:49.582 [info] UNRESPONSIVE extension host: received responsive event and cancelling profiling session
2022-10-25 11:19:30.813 [warning] [perf] Renderer reported VERY LONG TASK (1121ms), starting profiling session 'e27044a4-6ae3-4388-957b-1a89ae8d880d'
2022-10-25 11:20:07.667 [warning] [perf] Renderer reported VERY LONG TASK (1257ms), starting profiling session '6acebfcd-bc03-4c15-a5e7-ccff282fabfb'

@DonJayamanne
Contributor

Same for me as well; I reloaded VS Code a couple of times, changed the logging level to trace, and still no changes.

Note: at one point the baseline was 17ms for me and the task took 500ms, and still no profile was generated.

2022-10-26 09:03:47.797 [info] [perf] Render performance baseline is 33ms
2022-10-26 09:03:55.396 [warning] [perf] Renderer reported VERY LONG TASK (529ms), starting profiling session '0cb26651-5c7a-49c3-93a8-d0fd3826c02b'

@jrieken
Member Author

jrieken commented Oct 26, 2022

@roblourens @DonJayamanne please check the main log for possible errors. Also, force a freeze by running "F1 > Force Retokenize" in checker.ts. Note though that this only works when profiling has started before the actual freeze (we cannot profile the renderer when it is already frozen).

@deepak1556
Collaborator

Baseline on my Linux VM was 42ms and I was able to confirm the profile generation and telemetry events with both snap and deb installs.

@deepak1556 deepak1556 removed their assignment Oct 26, 2022
@alexdima
Member

Worked for me too, I could see the profile being generated and the telemetry events.

@roblourens
Member

Seems like everyone else verified this part and I'm too dumb to find it, but what do I have to do to get the telemetry log to show up now with the changes to the Log Level? This is with "All" set to "Trace" and there is no specific category for Telemetry.


@roblourens roblourens removed their assignment Oct 26, 2022
@DonJayamanne
Contributor

Yay, this magically works for me today. However I cannot see the telemetry, will file an issue for that.

@DonJayamanne DonJayamanne removed their assignment Oct 26, 2022
@DonJayamanne added the verified (Verification succeeded) label Oct 26, 2022
@DonJayamanne
Contributor

Ignore that, I was looking at Extension Telemetry

Verified as working.

@alexdima
Member

@roblourens I had telemetry turned off and could see entries there only after turning it on, reloading, and enabling tracing again.

@github-actions github-actions bot locked and limited conversation to collaborators Dec 10, 2022