"Evaluating Large Language Models Trained on Code", arxiv.org/abs/2107.03374
A detailed description of the early (pre-GitHub Copilot) versions of OpenAI Codex. This is the "paper of the year" so far: we finally have real progress in AI-assisted computer programming (and the difficulty of computer programming is the key bottleneck limiting the speed of progress).
See comments in dmm.dreamwidth.org/44860.html for details.
JuliaCon 2021 starts on July 20 with 8 days of workshops followed by 3 days of the main conference. JuliaCon 2020 was great, and this year is likely to be even better.
This is a fully virtual conference for the second year in a row; registration is free and is only needed to access interactive features, poster sessions, and the like, but the bulk of the materials will be accessible via YouTube without registration. I created a post at dmm.dreamwidth.org/46160.html with links, and I'll keep adding comments to it as the conference progresses.
Cross-post: anhinga-anhinga.livejournal.com/85003.html