Survey: function argument limit for uncurried calls


#1

Dear users,
We are working on generalizing uncurried calling convention support, for example, get rid of value restriction support so that you can define polymorphic uncurried function, support uncurried function with labels etc.
There is a technical limitation that the function type has to be pre-declared so that the number of function arguments can not be arbitrary. scala used to have similar problems https://contributors.scala-lang.org/t/scalas-22-limitation/1943
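To make the Scala analogy concrete, here is a rough sketch of the kind of encoding involved. The `fn1`/`fn2`/`fn3` names are hypothetical (Scala's actual types are `Function1` through `Function22`); the point is only that each arity needs its own pre-declared type:

```reason
/* Hypothetical sketch: arity-indexed function types, one per arity,
   analogous to Scala's pre-declared Function1 .. Function22 traits. */
type fn1('a0, 'ret);
type fn2('a0, 'a1, 'ret);
type fn3('a0, 'a1, 'a2, 'ret);
/* ...and so on, up to some fixed maximum arity. A call with more
   arguments than the largest pre-declared type cannot be expressed. */
```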
There are some work around(non-trivial added complexity) to support more but I wonder if this has a potential real impact on your code if we have such a limitation?
Thanks


#2

Would the limitation affect curried functions as well?


#3

I’m not sure I understand what you mean by pre-declared; does it mean it cannot be an anonymous function?
The real question is indeed whether it will impact curried functions, and hence current code.


#4

@bobzhang: Would it be possible to describe in code how this would work, and what the difference is with the current situation? Currently we also have to have the uncurry declaration on both sides (which is a bit unfortunate/confusing).

It would be great to have better inference for functions, and to prevent the unnecessary Curry._* calls that happen quite often at the moment. If that means some extra annotation in the function declaration, that would be fine in my opinion.


#5

Note that it does not have any impact on curried functions; they will work as is.

We have uncurried function support, but there are serious limitations: you cannot create a toplevel polymorphic uncurried function, and it does not support labels. We are working to lift these limitations. The tradeoff is that the technique we used to solve this issue needs the types to be pre-declared (same as Scala); since we can only pre-declare a limited number of types, we can only support a limited number of arguments.

Another possibility is that we still support an arbitrary number of arguments: for arity up to 22 (supposing we have the same limit as Scala) the limitation is gone, while for arity above 22 the old limitation still holds. This brings more complexity, and I am not sure it is worth supporting both.
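For illustration, here is the kind of definition the generalization is aiming to allow. This is a sketch, not current syntax: today a toplevel polymorphic uncurried function is rejected by the value restriction, and labels are not supported in uncurried definitions at all.

```reason
/* Illustrative only: what lifting the restrictions could allow.
   A toplevel *polymorphic* uncurried function, with *labels*. */
let pair = (. ~first: 'a, ~second: 'b) => (first, second);

/* Hypothetical uncurried, labeled application: */
let p = pair(. ~first=1, ~second="one");
```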


#6

Ah, you mean something like this:

type f('a) = uncurried((~foo: int, ~bar: 'a) => 'a);
[@bs.val] external f: f('a) = "f";
let baz = f(~foo=9, ~bar="bar");

I’m for anything that would simplify the API consumers’ lives, I don’t mind having to write slightly heavier bindings if the usage is easier afterwards.

I work on bindings of a significant size. The improvements that were introduced recently (records as objects, unboxed types) really made my bindings more convenient to use, still with zero runtime cost. The only bindings I’m not 100% happy with are the ones with uncurried functions or bs.this (which is quite similar and would, I hope, be solved by the same technique), so I think this would really go in the direction of having idiomatic and performant bindings for everyone.


#7

For code that uses a lot of Belt’s functions with uncurried callbacks (reduceU etc.), would that mean that we would have to annotate all types for the callbacks? Or would those be inferred?
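For context, this is what such a call looks like today with the existing Belt API; the `(. )` marker declares the callback as uncurried, and the callback's argument types are currently inferred from `reduceU`'s signature:

```reason
/* Existing Belt usage: reduceU takes an uncurried callback,
   written with the (. ) marker. */
let sum = Belt.Array.reduceU([|1, 2, 3|], 0, (. acc, x) => acc + x);
/* sum is 6 */
```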