Package references in F# scripting #542

Closed
ctaggart opened this issue Feb 22, 2017 · 100 comments

Comments

@ctaggart

ctaggart commented Feb 22, 2017

[ edited by @dsyme to be a more comprehensive guide to this design question ]

[ Latest implementation is here: https://github.com/dotnet/fsharp/pull/4042 ]

Package references in F# scripts

There has been a long-standing desire to add the ability to reference nuget packages from F# scripts. Originally this was conceived as basic fully qualified nuget references like packages.config. Lately this has evolved into integrating "dotnet" or "paket" or "npm" package specifications and dependency management tools as part of the toolchain, or providing generic hooks to allow this.

Related links

Design principles

  1. You can add a package reference to a script with a single line using a normal text editor that supports F# - no extra files (e.g. a packages.config) are needed

  2. Package references include version constraints, and dependency resolution is performed a la nuget v3 and/or paket

  3. It works "at design time", i.e. I can open a script containing package specifications and quickly get editing and type checking against a resolved set of packages - without needing to run any of the script or any command line tools

  4. It works universally, i.e. it works on any default install of F# editing + scripting, whether Ionide or the Visual F# Tools or web-hosted delivery of F# scripting such as Azure Functions or Azure Notebooks.

  5. It aligns well with how package management will be dealt with in the future of the .NET toolchain

  6. It works "the same way" on both .NET Framework/Mono and .NET Core, at least to some approximation.

  7. It considers the needs of F# when used as a JavaScript programming language through toolchains such as Fable or WebSharper. Here NPM is a natural package manager, though there are others.

  8. The design and implementation do not induce a dependency on any one specific package manager within the core F# toolchain (i.e. compiler/scripting/editor/Fsharp.Compiler.Service/ProjectCracker). It might be that different package managers have some specific support to make them work, but we remain open to new package managers.

  9. The implementation does not induce bad "layering problems" in the F# toolchain implementation reminiscent of MSBuild, see below.

  10. It works efficiently - package resolution is amortized, for example.

  11. The default settings in editing and execution tools are sufficiently space-efficient, sharing packages between scripts if needed to achieve this.

Possibiity. "dotnet" references in scripts

The way the new dotnet core tooling loads nuget packages and their assemblies is awesome! I've been using its extensibility to build a DotnetCliTool. All the dependencies are downloaded & loaded from the single %userprofile%\.nuget\packages directory. I would like to be able to use this same mechanism from scripts. I would like the same types of reference to be supported, package references and project references.

Instead of having to use a nuget client to download the package and then reference the assembly like:

#r "../../packages/Microsoft.Azure.WebJobs.2.0.0-beta2-10515/lib/net45/Microsoft.Azure.WebJobs.Host.dll"

I want to be able to do:

#package "Microsoft.Azure.WebJobs@2.0.0-*"

Possibility. #project "my.fsproj" references in scripts

People have suggested that project references should work the same way that package references work in Visual Studio 2017 with the dotnet core tooling, e.g. being able to do:

#project "my.fsproj"

Possibiity. #r "paket: Foo.Bar.dll" references in scripts

The experiment dotnet/fsharp#2483 contains a prototype of integrating paket package management directly into the F# programming model for .NET Framework programming. This includes design-time support. The experiment violates one of the design principles above - it "bakes in" support for Paket only. However, that support could be factored into fsi.exe.

Possibility. #r "packages" references in scripts

People have suggested that a script simply referencing #r "packages" should implicitly pick up the packages from its context, e.g. the containing packages.config or solution or paket.dependencies in the toolchain. In the experiment adding Paket support to FSI this is #r "paket".

Possibility. implicit generation of load scripts

Paket has a feature to accurately generate .fsx and .csx load scripts containing the #r references suitable for use with F# and C# scripting. This feature is a natural and simple way to integrate package resolution - simply have the package manager resolve the packages, generate the scripts and load the scripts.
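For example (a minimal sketch, assuming Paket's default output folder of .paket/load/ and an illustrative target framework), a consuming script then needs only a single #load of the generated script, which in turn chains ordinary #r lines to the resolved assemblies:

#load ".paket/load/net45/FSharp.Data.fsx"   // generated by: paket generate-load-scripts

open FSharp.Data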

Possibility. sharing packages and package caches

A major question is where packages are cached. This is primarily a responsibility of the package manager, but becomes a serious issue for scripts because packages can't be duplicated for 100s of scripts, so some shared caching is needed.

Possibility. align with C#

It is possible we should approach this in a similar way to how the C# REPL and scripting want to approach the problem for .NET Core. There are shared concerns here and it is likely we should move forward with the same model for how assemblies and assembly versions are loaded in a scripting session.

Given that C# and F# scripting are also used for Azure Functions, it seems to me that a shared behavior here would be best. However this would mean spec'ing out that behavior all-up and perhaps building out an underlying component that F# and C# could sit atop.

ScriptCS also wants to align with a compatible mechanism: #542 (comment)

Possibility. allow expression of SemVer version constraints

Paket and dotnet both have ways of specifying version constraints. The ability to include these prior to package resolution is important.

#542 (comment)
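As a rough illustration (the directive spellings are hypothetical, but the constraint operators mirror existing NuGet floating versions and Paket operators):

#r "nuget: Newtonsoft.Json, 9.0.*"        // hypothetical: any 9.0.x release (NuGet floating version)
#r "paket: nuget FSharp.Data ~> 2.3"      // Paket-style: at least 2.3, below 3.0
#r "paket: nuget XPlot.Plotly >= 1.4.2"   // Paket-style minimum version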

Possibility. autocomplete and search

Auto-completing package names gives a great way to search and discover package functionality.

Basic autocomplete is possible already in package specifications like packages.config and paket.dependencies. e.g. see autocomplete in Ionide

Additionally, auto-completing on search terms such as #r "package: statistics", giving a search over package description text, would be helpful.

Challenges

Some things make this tricky for F#

Challenge: Compiler architectural layering (basics)

The basic "natural" layering of the toolchain is

Parsing --> Checking --> CodeGen --> FSC .EXE

Parsing --> Checking --> CodeGen --> Scripting --> [AssemblyResolution]--> FSI.EXE

Here [AssemblyResolution] is a plugin-point or the mechanism used to resolve assemblies.

Note that in this architecture the F# editing tools support the F# scripting programming model via FSharp.Compiler.Service.dll.

Parsing --> Checking --> CodeGen --> Scripting --> [AssemblyResolution] --> FSharp.Compiler.Service.dll --> editing tools and scripting hosts

This means that as things stand today the implementation of F# scripting is not "a tool on top of F#" (a la scriptcs) but actually part of FSharp.Compiler.Service.dll and thus pretty much universally supported in F# editors. This allows us to deliver scripting into a very wide range of contexts simply, efficiently and consistently, e.g. into Ionide, Azure Functions, Azure Notebooks and many online tools.

With integrated package resolution one option is that this becomes

Parsing --> Checking --> CodeGen --> FSC --> Scripting --> [AssemblyResolution] --> [PackageResolution] --> FSI.EXE

where [PackageResolution] indicates a potential plug-in point. An alternative would be to build a layer "outside and on top" of FSI.EXE

Parsing --> Checking --> CodeGen --> FSC --> Scripting --> FSI.EXE -->[AssemblyResolution] --> [PackageResolution]  --> ACTUAL-FSI-TOOL

However, this approach doesn't seem to work particularly well with the incremental addition of package specifications in a REPL session.

Challenge: Compiling scripts and architectural layering

Traditionally the tool fsc.exe has supported the ability to compile F# scripts including their references. This has induced a violation of layering in the toolchain: the FSC.EXE tool also included the logic for assembly resolution and quite a lot of logic for processing F# files as scripts. This meant that the FSC.EXE tool and FSharp.Compiler.dll became badly dependent on the MSBuild API simply to resolve assembly references in scripts.

Worse still this "leaked out" into the logical specification of the compiler itself. The compiler was now able to accept strange assembly specifications such as -r:System. FooBar,Version=3.2.10,.. on the command line and resolve them with MSBuild. In the original world of .NET MSBuild shipped as part of the .NET Framework. However when MSBuid became separated this started to cause immense pain, and even more so when ,.NET Core came about. This problem was poisonous to whole toolchains built on FSharp.Compiler.Service, and we only recently did the hard work to "extract the rotten tooth" and make MSBuild optional

Challenge: .NET Core toolchain

The .NET Core toolchain changes some things about how F# compilation is surfaced. In particular it adds a layer dotnet ... via which all tools are accessed from the command line. This means that with .NET Core the architectural layering becomes something like this:

Parsing --> Checking --> CodeGen --> FSC.EXE --> dotnet-fsc

Parsing --> Checking --> CodeGen --> Scripting --> [AssemblyResolution] --> FSI.EXE --> dotnet-fsi

The question is where [PackageResolution] happens in this toolchain. But first there are other questions that need to be resolved:

  • Not all variations of F# running on .NET Core go through the dotnet tool. For example, clients of the FSharp.Compiler.Service.dll running on .NET Core such as the Fable compiler do not - they just embed F# parsing and checking directly via the compiler service DLL.

  • Fable assumes an installation of "dotnet". However will it be the case that all clients of F# scripting (editors, engines etc.) can assume an installation of "dotnet"? If I embed F# scripting in an application, do I assume "dotnet" is installed? Do I have to download reference assembly packages from nuget to do simple F# script execution?

  • It is not clear what the specification of the F# .NET Core scripting engine will be with respect to loading "uplevel" versions of the assemblies that are used to implement the scripting tool itself. This is discussed below.

It seems that these questions need to be resolved before integrated package management can be addressed.

@smoothdeveloper
Contributor

@ctaggart I don't know if you encountered paket / generate-load-scripts?

https://fsprojects.github.io/Paket/paket-generate-load-scripts.html

Right now it doesn't leverage the new location for nuget, but maybe it will in the future; when that is the case, this could be handled by having the generated scripts load the assemblies from that location instead.

I'm not sure we should bake such functionality into the compiler / fsi already; I think it opens the door to many cans of worms. There have been discussions about a #nuget directive or a way to add assembly resolution handlers.

@smoothdeveloper
Contributor

smoothdeveloper commented Feb 22, 2017

One problem I see with #package "name-version" is that a set of scripts might end up using different versions / you need to maintain the version in many places consistently, there are questions about transitive dependencies, etc.

I believe NuGet or Paket (as tools used outside the script) help to handle those problems.

I'd be favorable to this proposal if there was a clearly defined way to register custom resolution of those #package directives.

@dsyme
Collaborator

dsyme commented Feb 22, 2017

I'd like to hear what @KevinRansom and @forki and others think about this

We need to do this in some form. I would like to keep the ability to integrate with Paket too, since I still believe it brings a lot of value. I would also allow packages to have their own load scripts as part of the package rather than just referencing the DLLs

I'm anxious not to tie ourselves too closely to aspects of packaging that might change, but I understand the huge value of being able to reference packages like this.

@dsyme
Collaborator

dsyme commented Feb 22, 2017

@ctaggart If you would like to prototype something as a fork of Microsoft/visualfsharp I'd be really glad to see it, just at least to get a feel for what the implementation would look like. It's probably not a huge amount of code to add if we're just referencing simple packages without transitive dependencies.

You will have to consider package sources besides nuget.org I suppose.

@forki
Member

forki commented Feb 22, 2017 via email

@KevinRansom

KevinRansom commented Feb 22, 2017

@dsyme, @ctaggart.

Yes, the dotnet CLI has an interesting way of managing references, and it is essential that we integrate FSI with it if we want a decent coreclr / fsi story.

I struggle a lot with figuring out what makes a good design for this though ... especially since the dotnet CLI no longer does project.json, but instead uses PackageReferences in a .XXproj file.

allowing #r to reference packages at run-time seems like an essential feature (it doesn't matter what the keyword looks like). However, there are a ton of issues with that ... E.g. what if you are running with .net 2.0 and using System.Runtime 4.1.0.0 and the package you reference depends on System.Runtime 4.1.1.0.

Do you:

  1. Fail
  2. Restore everything again and restart with the new dependencies ...
  3. Something else ...

Given that the dotnet cli is dependent on msbuild, do we have a project file ... If we use a project file, then I expect it would be possible to plug paket in by munging target files; it really depends what role is desired for the tool.

If we don't want to go with a project then the plug point would need to be better developed. I have not followed paket in the nuget 3.0 world, so I don't know how it fits; others could speak to that.

The other thing is how we want FSC to behave; the FSC command line on coreclr is very full and some of this stuff could simplify things.

I desperately want to make time to think this through ... and events keep getting in my way. Which makes me quite sad and very frustrated.

@dsyme
Collaborator

dsyme commented Feb 23, 2017

I must admit it feels we are very close to working answers for these things for Paket; it's just not integrated into FSI.EXE and the various editing tools. We already use a non-integrated, manual, two-phased version of it in Azure Notebooks, e.g. if you look at the samples here https://notebooks.azure.com/dsyme/libraries/fsharp-templates.

An FSI-integrated version of this would, at a logical level, keep an incremental paket.dependencies script. A complete script would look like this:

#r "paket: nuget FSharp.Data"
#r "paket: nuget XPlot.Plotly"

open XPlot.Plotly
Chart.Line [ 1 .. 10 ]

The spec would be that each line

#r "paket: some-text"

implicitly adds that text to a script-specific paket.dependencies (not an actual file, just implicit). Just like the existing #r DLL references, these would, at design-time, be collected and resolved (using the existing Paket.Core API, which is very simple), prior to assembly resolution. As part of this we would regenerate the paket-files/load.fsx script (that's the exact name of the script) and that would then be provided back to the compiler service to analyze for DLL references and for execution at runtime. The Paket framework setting would default to whatever we are using by default in FSI.EXE e.g. net461 and the source setting would default to nuget.org unless otherwise specified.
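Concretely (a sketch, with the defaults described above filled in; the exact source URL and framework are illustrative), the two #r "paket: ..." lines in the example script would correspond to an implicit dependency specification along these lines, from which the load script is then regenerated:

// implicit, script-specific paket.dependencies (never written to disk as-is)
source https://api.nuget.org/v3/index.json
framework: net461
nuget FSharp.Data
nuget XPlot.Plotly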

The only really hard things are the decisions up to Paket: (a) where the package cache resides (b) if the packages are version qualified and (c) if the packages are collected when no longer used. For Paket the default is normally to install into the packages directory under the script without version numbers in the paths, and for Paket to manage collection of the packages when no longer needed. Paket likely has settings to adjust these things and it would just be a matter of choosing the right default policy.

I do wonder if we (the F# community) should just forge ahead and trial adding this functionality directly to the Visual F# Tools, based on some fixed version of Paket, via a PR for people to try out. If we did I reckon we would iterate very quickly to something that is very usable for .NET Framework scripting (and after all 100% of F# scripting is .NET Fx today). We could also then iterate on .NET Core scripting.

This would, however, mean shipping a fixed version of Paket.Core.dll as part of the Visual F# Tools (really alongside FSI.EXE) for use in the F# scripting model. I don't think that would be a bad thing, but if we wanted to make that more explicit we could do this:

#r "/some/other/paket.dll"
#r "paket: nuget FSharp.Data"
#r "paket: nuget XPlot.Plotly"

open XPlot.Plotly
Chart.Line [ 1 .. 10 ]

where the first line loads the package manager and conforms to some interface. Subsequent lines would then pass off the information to the named package manager - like type providers this has to happen at design-time too, i.e. in the IDE. But the API would be fixed in stone.

It's possible the nuget keyword should be implicit for the common case, so:

#r "paket: FSharp.Data"
#r "paket: XPlot.Plotly"

To be honest I'd love to see a prototype of this for the Visual F# Tools, and if it got integrated I would use it all the time. I'm sure someone would add the IDE feature to give autocomplete for package names too on #r "paket: $$$", and we'd also see the feature integrated into FSharp.Compiler.Service and all of Ionide etc. If we started today and let the community ( @forki :) ) loose on it I reckon a full prototype would probably be done in.... well.. these guys are just so fast.....

The problem is that I feel like the dotnet version of the story (which definitely needs to happen) is somehow holding us back - the churn last year in project.json means we're waiting for that story to stabilize, in conjunction with stabilizing F# .NET Core scripting and so on. Of course, we eventually need a full #r "dotnet: ..." too and for both to sit alongside each other and be fully companionable. But I wonder if it might not be better to just forge ahead with a #r "paket: ..." feature and use it as a forcing function to work out what the #r "dotnet: " story looks like, while equally landing a great feature for F# and the Visual F# Tools

@smoothdeveloper
Contributor

The problem is that I feel like the dotnet version of the story (which definitely needs to happen) is somehow holding us back - the churn last year in project.json means we're waiting for that story to stabilize, in conjunction with stabilizing F# .NET Core scripting and so on.

@forki will confirm, but the aim (and the next paket release is there, or almost there) is to have a "native-like" experience with paket on dotnetcore (integrated with dotnet restore / dotnet build). Also, for now there is no scripting story in dotnetcore, as the project.json/*proj files are for compiling / packaging assemblies and are not used in the context of scripting.

@dsyme
Collaborator

dsyme commented Feb 23, 2017

As an aside, one can also imagine #r "npm: ..." being very useful in Fable scripting (i.e. for F# scripts compiled to JavaScript using Fable).

@KevinRansom

KevinRansom commented Feb 23, 2017 via email

@smoothdeveloper
Contributor

@ctaggart going back to your suggestion, your main concern is relying on the shared packages folder, rather than the local packages folder, which is often a waste of space.

Assuming paket generate-load-scripts would rely on the shared packages folder, would that be satisfactory enough, or do you believe integration in the language (as @dsyme & others are sketching) is the way to go?

I think for that to happen, it would be necessary to have a cross-platform way to resolve that shared packages location; %userprofile%\.nuget\packages seems Windows-specific, and we currently can't bake that into the #r directives being generated.
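For illustration, a cross-platform resolution could honour the standard NUGET_PACKAGES environment variable and otherwise fall back to ~/.nuget/packages (a sketch, not an agreed design):

open System
open System.IO

// Locate the shared (global) NuGet packages folder in a cross-platform way.
let globalPackagesFolder =
    match Environment.GetEnvironmentVariable "NUGET_PACKAGES" with
    | null | "" ->
        // Default location on Windows, Linux and macOS: <user profile>/.nuget/packages
        Path.Combine(Environment.GetFolderPath Environment.SpecialFolder.UserProfile, ".nuget", "packages")
    | overridden -> overridden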

@baronfel
Contributor

There is also some discussion of using symlinks in Paket for the local packages folder, which would perhaps resolve some of the concerns about copying files around. Discussion at fsprojects/Paket#2141

@dsyme
Collaborator

dsyme commented Feb 23, 2017

going back to your suggestion, your main concern is relying on the shared packages folder, rather than the local packages folder, which is often a waste of space.

@ctaggart @smoothdeveloper I think we should treat this as orthogonal to the basic pressing need to add paket/dotnet/npm package manager support into the F# scripting model. Each of these package managers can, if they want, allow the specification of the policy w.r.t. package location, cross-script-sharing and collection. The defaults should also be up to each package manager.

There is no one "right" solution regarding sharing packages v. xcopy packages, there are just mechanisms, tradeoffs and defaults.

@Rickasaurus

Seems to me like this begs for an FSI pragma plugin architecture. It could maybe be as simple as a function (a single-member interface) that takes in a string (the parameters to the pragma) and an object that lets you feed commands to FSI as strings.

Forgive me if this is difficult due to the current architecture, it's been quite a while since I've played with the compiler service.

@Krzysztof-Cieslak

Krzysztof-Cieslak commented Feb 23, 2017

I think the editor part is pretty easy. I don't know the Paket API well enough to make it work in memory (right now I depend on a physical paket.dependencies and paket.lock), but with @forki's help I'm sure we can do it.

[screenshot: paketscriptreferences]

[Obviously that's just super ugly hack]

@forki
Member

forki commented Feb 24, 2017

Speaking of ugly hacks:

[screenshot: plotly]

@forki
Member

forki commented Feb 24, 2017

see dotnet/fsharp#2483 for WIP progress

@enricosada

Seems to me like this begs for a FSI pragma plugin architecture

@Rickasaurus maybe we just need to support URIs in #r, and add something to register additional scheme handlers. Extending could maybe be done by invoking functions on the fsi object?

  • paket: nuget: XPlot.Plotly
  • nuget: XPlot.Plotly/1.0.* (resolved with nuget rules, like the PackageReferences)
  • the default .\prova.dll (without scheme) can be translated as file://./prova.dll

Doing the dotnet: (please call it nuget:) is not that big an issue; a csproj/fsproj can be used to just resolve <PackageReference> items as a quick hack.

@dsyme I see only two big issues:

short term: transitive deps between packages

With single packages (#r "paket: mypkg") it works OK, but all #r are processed one after another.
There should also be something to resolve multiple packages together, because transitive deps between packages are not explicit (unlike dlls), in order to resolve compatible packages.

That is an issue with the nuget <PackageReference> style too, because first all package refs are evaluated, and only afterwards are the real package versions resolved.

Maybe it can be easier for the medium term to:

  • have the fsi script include the paket package (embedded, or better, downloaded from nuget for extensibility)

  • bootstrapping from a nuget package is important, for extensibility and easier future changes (that's why nupkgs are good)

  • that package registers to an fsi variable, so fsi.packageManager is the paket handler

  • the #r "nupkg: <string>" directive aliases to fsi.packageManager.Resolve("<string>")

  • but it is also possible to do

    fsi.packageManager.Resolve
         [ "nuget: NETStandard = 1.6"
           "nuget: Newtonsoft.Json ~ 9.0"
           "nuget: Plotly" ]

long term: target framework versioning

OK, netstandard2.0 is going to help to have a shared BCL API, but it is just an API specification, based on a package.
Maybe I want to use the full .NET Core API, or the full net46 API, in the script; that's not going to help much. Also some packages will be compatible only with netcore (to use specific functionality, because innovation will be done in the .NET Core runtime), and the same goes for .NET full or Mono.
The csproj/fsproj is slim because it implicitly includes the .NETStandard1.6 package, but the dependency exists and can be disabled to include other versions.
Same for the runtime.
Maybe we should discuss that in another github issue.

@forki
Member

forki commented Feb 24, 2017

Regarding resolution of multiple packages: I already have that in the WIP PR. It will resolve on the first expression that is not #r, so go ahead and use as many packages as you want.
It also references transitive dependencies via the generated load scripts, so that should work as well.

@dsyme
Collaborator

dsyme commented Feb 24, 2017

the #r "nupkg: " alias to fsi.packageManager.Resolve("")

@enricosada In all this, it's very important to remember that we need to trigger package resolution, access generated package load scripts and access resolved DLLs at design time, i.e. when type checking the script in the editors, not just at runtime in FSI.EXE. This means that this stuff is static, not just dynamic, and that the #r lines are static declarations like the existing #r and #load.

I'll work on a prototype of taking @forki's dynamic FSI.EXE work and playing it instead as part of the design time logic.

(Note, because package resolution can be lengthy and involve disk space etc. we may need to add a confirmation dialog in some editing tools before we download masses of packages. TBD )

@Krzysztof-Cieslak

I'll work on a prototype of taking @forki's dynamic FSI.EXE work and playing it instead as part of the design time logic.

My ugly hack injects additional -r: path/to/dll/resolved/by/paket.dll to FSharpProjectOptions after calling FCS' GetProjectOptionsFromScript
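A minimal sketch of that hack (assuming an FSharp.Compiler.Service version where GetProjectOptionsFromScript returns the options together with diagnostics; the namespace and exact signature have shifted between FCS releases):

open FSharp.Compiler.SourceCodeServices

let checker = FSharpChecker.Create()

// Ask FCS for the options it computes for the script, then append the
// Paket-resolved assemblies as extra -r: arguments before type checking.
let getOptionsWithPaketRefs (scriptPath: string) (scriptSource: string) (resolvedDlls: string list) =
    async {
        let! options, _diagnostics = checker.GetProjectOptionsFromScript(scriptPath, scriptSource)
        let extraRefs = resolvedDlls |> List.map (fun dll -> "-r:" + dll) |> List.toArray
        return { options with OtherOptions = Array.append options.OtherOptions extraRefs }
    }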

@forki
Member

forki commented Feb 24, 2017

OK, now it can just use the paket files that are already present in a repo:

[screenshot: plotly2]

If it can't find that, then we fall back to a temp path. Next step: locate the paket.exe in such a fallback situation.

@isaacabraham

Taken from the PR and copied here...

Any reason why we don't just go for a #paket node rather than something explicitly coupled to #r? I suppose I could see #load being used in the same way (for Paket to read from GitHub etc.), but maybe putting it right at the top level could open up possibilities for adding features to F# scripts aside from just reference and load?

@matthid

matthid commented Feb 24, 2017

While I agree with most of what has been said already I'd like to throw some additional ideas around:

  • It's not clear if this needs to be in the core. I already experimented with adding something similar to FAKE (https://github.com/matthid/FAKE/blob/coreclr/help/fake-dotnetcore.md#paket-inline)
  • IMHO, in addition to specifying dependencies "inline" (as suggested here), we should be able to reference a paket dependencies file as well. This might help in large projects, to be able to specify your dependencies globally (the core idea of paket). A rough sketch of both follows.
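The exact syntax in the FAKE experiment (and what later shipped) may differ, so treat these lines as illustrative:

// inline dependency specification inside the script itself
#r "paket:
nuget FSharp.Data
nuget Fake.Core.Target //"

// or point at a group in an existing, shared paket.dependencies instead
#r "paket: groupref Build //"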

@forki
Member

forki commented Feb 24, 2017 via email

@dsyme
Collaborator

dsyme commented Feb 25, 2017

Back to @KevinRansom's question:

E.g. what if you are running with .net 2.0 and using System.Runtime 4.1.0.0 and the package you reference depends on System.Runtime 4.1.1.0.

Right now F# Interactive just redirects this kind of #r to the loaded version, and MissingMethodException can happen.

If doing proper package resolution then you should surely just fail or warn, hopefully with a nice message that your script runner should run a later version. That's on the assumption that there's really no technical possibility of running two different versions of System.Runtime in the same process.

Alternatively you could have dotnet fsi (or whatever the final way of doing F# scripting is on .NET Core) do the package resolution and pass the right arguments in prior to actually starting fsi.dll. On .NET Framework you'd need to put a package resolver in front of fsi.exe, though the problem is less severe there since there are fewer DLLs and less version churn.

It's also absolutely critical that this stuff works at design time (so you don't need to run the script to get proper editing of the script against its resolved packages).

@dsyme
Collaborator

dsyme commented Feb 25, 2017

@matthid

It's not clear if this needs to be in the core. ...

I'm only interested in a language/tooling-integrated package referencing feature if

  1. it works "at design time" , i.e. I can open a script containing the text below (or some non-paket variation of it) - without running any command line tools - and quickly get editing and type checking against a resolved set of packages - without needing to run any of the script
#r "paket: nuget FSharp.Data"
#r "paket: nuget XPlot.Plotly"

open XPlot.Plotly
Chart.Line [ 1 .. 10 ]
  1. it works universally, i.e. it works on any default install of F# editing + scripting

  2. it works the same way on both .NET Framework/Mono and .NET Core, at least to some approximation, and perhaps also Fable, though the sets of packages would differ in each case

These sorts of things are the "magic" that would really drive F# scripting a long way.

I'd like to see a set of design principles like this (in addition to the experiments :) )

@jwosty
Contributor

jwosty commented Aug 2, 2018

Ok, thanks. Just want to say I really appreciate all the stuff you guys are doing! Keep up the awesome work! :)

@smoothdeveloper
Contributor

@matthid to add to what @KevinRansom was saying, I think for the example you mention, we would rely on the package manager to fail with a stable error message / error number.

This would allow VS or other editors to detect well-known errors and provide a code fix / a way for the user to install the dependency.

If you think more is needed, you might want to propose additional design for things that could be made available in the PackageManager assembly and used, but I think it is a can of worms to have that happening inside the FSI process rather than propagating errors to be handled by the tooling.

@matthid

matthid commented Aug 2, 2018

If you think more is needed.

Yes indeed. I think error scenarios are almost entirely missing from the spec. In particular we need to design/decide:

  • What is the best way for an extension to report errors it already knows about?
  • How do we report errors if the script generated by the extension has errors? (We shouldn't report regular FS errors, imho, but extend them in some way, and also consider continuing the evaluation; see the PR comment regarding the breaking change and @dsyme's comment at the same place.)
  • What if loading the extension-dll fails?
  • Any other error scenarios?

I think we should look at all of those from a user's and an extension developer's perspective and specify those cases a bit. We should learn from type providers, where errors are often useless for the user AND the developer (which is really the worst of both worlds).

My suggestions would be:

  • Specify some common error scenarios which can be used by the extension (the reason against this is that it will always be incomplete)
  • Decide what a user sees by default on an unhandled exception (i.e. only the message)
  • Depending on the previous: decide how a developer can see the full stack-trace of an unhandled exception without debugging the compiler, for example by setting an environment variable or a compiler switch
  • We should continue on errors and only emit a warning; we do the same today if the #r is invalid
  • Ask @Krzysztof-Cieslak regarding what would be the best for tooling, as I have no idea if my suggestions are actually useful in that regard.

@alfonsogarciacaro

@matthid Sorry, I haven't followed the discussion. Fable just needs the list of arguments that must be passed to the F# compiler. In the case of dependencies, this means resolving the actual .dll references (like -r:/Users/Anne/.nuget/packages/Fable.Core.2.0.0/lib/netstandard2.0/Fable.Core.dll). This is what Dotnet.ProjInfo does for us, and what FSharpChecker.GetProjectOptionsFromScript used to do too (currently disabled precisely because it's difficult to add nuget packages to a script and it's not useful to program Fable if you don't have access to Fable.Core).

Not sure how this is supposed to work, but if .GetProjectOptionsFromScript works the same, it shouldn't be a problem for Fable.

@matthid

matthid commented Aug 2, 2018

@alfonsogarciacaro I was speaking more about whether the #r "manager:" extension specified here will work in the Fable REPL, for example. I'm not sure the current reflection-based code would do something useful in the browser?

@alfonsogarciacaro

alfonsogarciacaro commented Aug 2, 2018

No, forget about adding dependencies to the online repl, we cannot even have multiple files there 😉

We do have some dependencies in the online repl, but preparing them is a somewhat complicated process; it must be done beforehand and it has limitations (only metadata can be read).

@matthid

matthid commented Aug 2, 2018

Ok, so probably a fun vacation project :P.
Anyway, I guess that means we can ignore fable for now and use #if once we know how to do it...

@reinux

reinux commented Feb 11, 2019

Any progress on this problem?

@cartermp
Member

@reinux Yes, you can follow the PR here: dotnet/fsharp#5850

It's temporarily on hold so that we can ship what we have with FSI and F# 4.6

@charlesroddie

charlesroddie commented Apr 10, 2019

I think in order for FSI to be useful in larger projects there should be better integration with the project system. This may not be possible with .fsx files, which don't appear to work inside projects (they don't appear to see definitions in previous files in a project, or definitions in referenced projects).

For example:

File0.fs and File1.fs live inside Project1 which references Project0.
Right-click on File1.fs or within File1.fs -> Load context to FSI. This (a sketch follows the list):

  • builds Project0, and references Project0.dll in FSI
  • loads code from File0.fs into FSI
  • if there is a cursor, loads code up to the cursor into FSI
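A sketch of the script such a command might generate behind the scenes (the paths and configuration are hypothetical):

// hypothetical output of "Load context to FSI" for File1.fs in Project1
#r "../Project0/bin/Debug/Project0.dll"   // the built reference to Project0
#load "File0.fs"                          // earlier files in Project1
// ...followed by the code from File1.fs up to the cursor, sent as FSI input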

Is this request separate enough from this discussion that it should go in a separate suggestion?

@cartermp
Member

Correct, .fsx files effectively float in space. What you're looking for is something akin to "generate project context" in the FSI instance, which could be a tooling feature, but it would be great if there was also an FSI mechanism for when you just launch the REPL in a project's directory.

@toburger

Any progress here?

@cartermp
Member

Nope.

@SchlenkR

So will the "#r nuget" in Jupyter only be possible in Jupyter (since it's a kernel feature), or how is that related to this issue here?

@abelbraaksma

Right click on File1.fs or within File1.fs -> Load context to FSI.
Is this request separate enough from this dicsussion that it should go in a separate suggestion?

@charlesroddie This would be a great feature to have, but being buried in this thread probably doesn't help give it traction. Back in VS 2015 there was an extension that created #r references for all the references in a project, dumped into a script file. That already made this process much simpler.

But I agree, a single 'send context to FSI' would just be awesome.

@cartermp
Member

@ronaldschlenker correct, until .NET 5, #r nuget will only work in the jupyter kernel (and previews of .NET 5). You can try it out here: https://mybinder.org/v2/gh/dotnet/try/master
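For example, in a Jupyter (dotnet-interactive) cell or a .NET 5 preview script, a package can be referenced directly; the version part is optional and the package/version below are just illustrative:

#r "nuget: FSharp.Data, 3.3.3"

open FSharp.Data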

Closing as this has been implemented.

@SchlenkR

Just for interest (if you have time to answer): why .NET 5? Will there be a unified scripting base with C#, or what's the deal here?

@cartermp
Member

.NET 5 is the next release train for F#, and since this is in preview today, that is the next available release for the feature (same goes with nameof and opening of static classes).

@ErikSchierboom
Contributor

I would love for the #project "my.fsproj" support to be added. When doing Haskell and GHCi, I always use its "open an interactive and load the current project's source code" feature. So productive and a brilliant way to explore what functionality you've written.

@toburger

I would love for the #project "my.fsproj" support to be added. When doing Haskell and GHCi, I always use its "open an interactive and load the current project's source code" feature. So productive and a brilliant way to explore what functionality you've written.

This would be awesome! 💯
As far as I understand, the resolution mechanism for those references will be extensible.

The resolution mechanism will work with a prefix like nuget (at the moment the only implementation) or paket.

#r "project: my.fsproj"

Please correct me if I am wrong.

@toburger

toburger commented Mar 20, 2020

In my imagination a possible project reference could be:

#r "project: my.fsproj"

Which executes the following steps when evaluated (.NET Core with NuGet references):

  • dotnet restore
  • dotnet build (someday with incremental recompilation)
  • Reference nuget packages (like #r "nuget: foo", but respecting the version provided in the fsproj file)
  • Reference the built project assembly
  • Register assemblies for editor tooling
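Usage might then look like this (purely hypothetical; the project path and namespace are made up for illustration):

// hypothetical: restore, build and reference the project's output assembly
#r "project: ../src/MyLib/MyLib.fsproj"

open MyLib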

@Krzysztof-Cieslak

I've worked on something during today's live stream - https://github.com/Krzysztof-Cieslak/DependencyManager.FsProj

@cartermp
Member

@ErikSchierboom @toburger @Krzysztof-Cieslak can you file an issue on dotnet/fsharp tracking this as a feature request? I think it's perfectly reasonable and @Krzysztof-Cieslak has shown a possible way forward.

@toburger

toburger commented Mar 21, 2020

Valid point! I've created the issue here: dotnet/fsharp#8764
