Add reference documentation links to classes like transforms and trainers (e.g. LightGBM).
Include parameter names in method calls (e.g. mlContext.Data.TrainTestSplit(data, testFraction: 0.2)).
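For instance, a call in the notebooks could use named arguments so a new reader can see what each value means. A minimal sketch, assuming Microsoft.ML is referenced; the tiny in-memory data set is hypothetical, standing in for the notebook's own data:

```csharp
using Microsoft.ML;
using Microsoft.ML.Data;

var mlContext = new MLContext(seed: 0);

// Hypothetical stand-in for the notebook's data set.
IDataView data = mlContext.Data.LoadFromEnumerable(new[]
{
    new { Feature = 1f, Label = true },
    new { Feature = 2f, Label = false },
    new { Feature = 3f, Label = true },
});

// The named argument makes the meaning of 0.2 obvious at the call site:
var split = mlContext.Data.TrainTestSplit(data, testFraction: 0.2);
```

Without the `testFraction:` label, a beginner has to look up the signature to know what `0.2` controls.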
Use real data for examples. It makes it easier to understand the problem being solved, as opposed to ra...
The last code block of the Alien translator notebook lacks clarity.
Because the same collection variable is used before and after the foreach loop, it looks like the "student" is expected to modify the collection in place, which is not possible with a foreach loop.
Since the notebook is aimed at new C#...
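To illustrate the point for new learners, here is a minimal sketch of the issue; the variable names are hypothetical, not the notebook's own:

```csharp
using System;
using System.Collections.Generic;

var words = new List<string> { "zorp", "blee" };

// A foreach loop cannot reassign its iteration variable, so the
// collection contents are never changed:
// foreach (var w in words) { w = w.ToUpper(); } // compile error CS1656

// An index-based for loop (or building a new list) modifies it in place:
for (int i = 0; i < words.Count; i++)
{
    words[i] = words[i].ToUpper();
}

Console.WriteLine(string.Join(", ", words));
```

If the exercise intends in-place modification, switching the scaffold to a for loop would remove the confusion.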
The machine learning notebooks reference Microsoft.ML version 2.0.0-preview.22356.1 from a private Azure DevOps server. This preview version of the Microsoft.ML package is not available from nuget.org, which makes the notebooks hard to run.
PackageManagement Error 3217 Invalid URI: The format of the U...
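If the preview package genuinely lives on a custom feed, .NET Interactive's `#i` directive can register that feed before the `#r` restore. A sketch only; the feed URL below is a placeholder, not the actual private feed:

```csharp
// Register the extra NuGet feed first (placeholder URL, not the real feed):
#i "nuget: https://pkgs.dev.azure.com/<org>/<project>/_packaging/<feed>/nuget/v3/index.json"

// Then restore the package as the notebook currently does:
#r "nuget: Microsoft.ML, 2.0.0-preview.22356.1"
```

Of course, this only helps users who can authenticate to that feed; publishing the notebooks against a public package would be the real fix.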
I have VS2022 installed, installed the Notebook Editor extension, and just opened the getting-started folder; I get the following error.
I am able to build C# projects with my VS2022, and the C# Interactive window also works.
Kernel Failed To Start.
Cannot find a tool in the manifest file that has...
Are the ML.NET notebooks using non-public preview nugets?
I tried to run the first ML.NET notebook, 01- Intro to ..., and got the following error:
#r "nuget: Microsoft.ML, 2.0.0-preview.22356.1"
C:\Users\nicho\AppData\Local\Temp\nuget\20332--ab38aca3-7c58-4abd-98f6-0079cd1c2c87\Project.fsproj : error NU...
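A workaround that may unblock running the notebook, assuming the notebook code compiles against a publicly released build: point the directive at a Microsoft.ML version that is actually published on nuget.org instead of the private preview.

```csharp
// Replace the private preview reference with a version available on nuget.org
// (2.0.0 is the stable release; adjust if the notebook needs newer APIs):
#r "nuget: Microsoft.ML, 2.0.0"
```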
Just noticed a small inconsistency; not sure if it's intentional.
When I click the C# 101 notebook links, they open in .NET Interactive in VS Code (with Python set as the kernel). When I click the ML.NET notebook links, they open in Visual Studio 2022.
The Training and AutoML notebook can consume a lot of memory, causing other processes to hang or crash.
Strangely enough, it usually works fine if you run the notebook only once. To reproduce the problem:
Open Windows Task Manager, and check your memory usage
Open Training and...