spankalee 3 days ago

    import {sum} from './sum.js' with {type: 'comptime'};
is an unfortunate abuse of the `type` import attribute. `type` is the one spec-defined attribute, and it's supposed to correspond to the MIME type of the imported module; that's why the two web-platform-supported types are "json" and "css". The MIME type of the imported file in this case is still `application/javascript`, so if this module had a type it would be "js".
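
For comparison, the spec-defined usage on the web platform looks something like this (file names here are only for illustration); in both cases the attribute names what the imported resource is, in line with its MIME type:

    import config from './config.json' with { type: 'json' };
    import sheet from './theme.css' with { type: 'css' };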

It would have been better to choose a different import attribute altogether.

  • alpinisme 3 days ago

    You're projecting the MIME-type idea from two examples, but the proposal is intentionally agnostic about what `type` might be used for:

    > This proposal does not specify behavior for any particular attribute key or value. The JSON modules proposal will specify that type: "json" must be interpreted as a JSON module, and will specify common semantics for doing so. It is expected the type attribute will be leveraged to support additional module types in future TC39 proposals as well as by hosts.

apatheticonion 3 days ago

I literally just want Rust-style macros and proc macros in JavaScript, e.g. using

```
const MyComponent = () => jsx!(<div></div>)
```

rather than a .tsx file.

That, or for wasm to become usable so I can just write my web apps in Rust.

  • krukah 3 days ago

    Maybe the (relative) lack of ecosystem has kept you away, but I really recommend checking out both Dioxus and Leptos. Leptos is incredibly similar to React, but with Rust ergonomics, and it's been a pleasure to learn and use. With an LLM by my side that knows React and Rust pretty well, I've found myself not even needing the React libraries I thought I would, since I can easily build the features/components I actually need on the fly.

    I, too, eventually gave up on React <> WASM <> Rust, but I was able to port all my existing React over to Leptos in a few hours.

    • apatheticonion a day ago

      Yeah, they are great; it's more the poor integration and lack of parallelism that make it not worthwhile.

      Thunking everything through JavaScript and not being able to take advantage of fearless concurrency severely restrict the use cases. You may as well just use TypeScript and React at that point.

  • sriku 3 days ago

    The Bun authors (and others) would probably do well not to repurpose already-understood terminology. "Macros" are already understood to be code that produces other code. "Comptime" is a nice alternative, but Bun's "macros" aren't macros in that sense.

    We had sweet-js macros as a library many years ago, but it looks like it went nowhere, especially after an incompatible rewrite that (afaik) remains broken for even basic cases. (Caveat: it's been a while since I looked at it.)

  • JoelMcCracken 3 days ago

    Every once in a while I get a strong urge to hack on sweet.js to add TypeScript support.

  • alpinisme 3 days ago

    That particular example is odd. What are you gaining by having a macro that needs a compile step vs no macro and just configuring your compile step to use a JSX loader for js files?

    • trgwii 3 days ago

      The general idea is something like prebaking computation into your deployed JS/TS. This is much more general than JSX-related tools, and a lot cheaper to run. In JS applications I often find myself doing various small bits of work on startup; comptime.ts would move all those bits to build time.
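
      A minimal sketch of that idea with comptime.ts's import-attribute pattern (the helper name and paths are made up; the result has to be plain serialisable data, since the thread notes functions aren't supported):

      ```
      import { buildSearchIndex } from './search.js' with { type: 'comptime' };

      // Evaluated once at build time; the shipped bundle contains the
      // precomputed index literal instead of the code that builds it on startup.
      const index = buildSearchIndex(['docs/a.md', 'docs/b.md']);
      ```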

      • alpinisme 3 days ago

        Oh, I get the value of comptime! I was specifically responding to the Rust-like macros comment.

        • apatheticonion a day ago

          There are quite a lot of valid use cases for being able to transform arbitrary tokens into JavaScript at "compile" time. One that already exists is JSX, which is a macro baked into the TypeScript compiler but restricted/tailored to React-style libraries.

          We sort of get around this today using template literals and eval, but it's janky. https://github.com/developit/htm
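
          For reference, the htm approach mentioned above is a tagged template evaluated at runtime (sketch assuming React; htm can bind to any createElement-style function):

          ```
          import htm from 'htm';
          import { createElement } from 'react';

          // The template literal is parsed at runtime (and cached), so there's
          // no build step, but also no compile-time checking of the markup.
          const html = htm.bind(createElement);
          const App = () => html`<div class="greeting">Hello</div>`;
          ```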

          A generic macro system could open the door to frameworks like Svelte, Angular, Vue, etc. being able to embed their template compilers (with LSP support) without wrapper compilers and IDE extensions.

          e.g. imagine syntax like this being possible (not saying it's good)

          ```

          export class MyComponent {

            template = Vue.template!(<div>{{ this.foo }}</div>)
          
            #[Vue.reactive]
            foo = 'Hello World'
          
            constructor() { setTimeout(() => this.foo = 'Updated', 1000) }
          }

          Vue.init(MyComponent, document.body)

          ```

          Where the `template!` macro instructs the engine how to translate the tokens into their JavaScript syntax and the `#[reactive]` macro converts the class member into a getter/setter that triggers a re-render calculation.

          It would need to be adopted by TC39, of course, and the expectation would be that, if provided at runtime, a JavaScript engine could handle the preprocessing; however, transpilers should be able to pre-compute the outputs so they don't need to be evaluated at runtime.

  • MrBuddyCasino 3 days ago

    I really really (really) don’t want Rust style macros and proc macros in JavaScript (or TypeScript), ever.

    • apatheticonion a day ago

      Might be a good idea to advocate for faster progress in wasm so fans of the feature don't try to pollute the language :p

  • Wintamute 3 days ago

    Writing a web app at the moment with C++/Emscripten. What makes wasm unusable in Rust?

    • apatheticonion a day ago

      It's not unusable per se; however, being unable to take advantage of Rust's fearless concurrency and having to glue everything together with JavaScript severely restrict the usefulness.

      May as well just use TypeScript and React at that point.

      The dream is to be able to specify only a wasm file in an HTML script tag, have the tab consume under 1 MB of memory, and maximise the use of client hardware to produce a flawless user experience across all types of hardware.

  • teaearlgraycold 3 days ago

    You want manual memory management for your web apps?

    • tekacs 3 days ago

      Rust memory management is... profoundly not manual?

      Case in point: I use Rust/WASM in all of my web apps to great effect, and memory is never a consideration. In Rust you pretty much never think about freeing memory.

      On top of that, when objects are moved across to be owned by JS, FinalizationRegistry is able to clean them up pretty much perfectly, so they're GC'd as normal.
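
      A rough sketch of that pattern (the `wasm.__free_widget` export name is made up; wasm-bindgen-style glue generates something along these lines):

      ```
      declare const wasm: { __free_widget(ptr: number): void };

      // Frees the Rust-side allocation once the JS wrapper is garbage collected.
      const registry = new FinalizationRegistry<number>((ptr) => wasm.__free_widget(ptr));

      class Widget {
        constructor(ptr: number) {
          registry.register(this, ptr, this);
        }
      }
      ```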

      • teaearlgraycold 3 days ago

        Wrangling the borrow checker seems pretty manual at times. And I don't know why you'd bother with a persnickety compile-time GC when JS's GC isn't a top issue for front-end development.

        • zdragnar 3 days ago

          The borrow checker just verifies that you're handling the concept of ownership of memory correctly.

          The actual management of memory - allocating, reclaiming, etc. - is all handled automagically for you.

          • auggierose 3 days ago

            There is no need for the concept of ownership of memory in JavaScript. So you are wasting time on a concept that doesn't matter in languages with a real GC. Dealing with ownership = manual memory management.

            • zarzavat 3 days ago

              You can still have ownership issues and leaks even with a GC, if an object is reachable from a root. E.g. object A is in a cache and it references object B, which references objects C, D, E, F, G ..., which will now never get collected.

              If A owns B, then that is as expected, but if A merely references B, then it should hold a WeakRef.
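
              A small illustration of that rule of thumb (names made up):

              ```
              const b = { c: {}, d: {} };                     // B and the graph it references
              const a = { name: 'A', weakB: new WeakRef(b) }; // A only references B weakly
              const cache = new Map([['a', a]]);              // A is kept alive by the cache

              // Once nothing else holds B strongly, it can be collected along with
              // C, D, etc.; deref() then returns undefined.
              const maybeB = cache.get('a')?.weakB.deref();
              ```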

            • zdragnar 3 days ago

              This used to not be true - once upon a time, Internet Explorer kept memory separate for DOM nodes and JavaScript objects, so it was very easy to leak memory by keeping reference cycles between the two.

              Now, with all the desire for WASM to have DOM access I wonder if we'll end up finding ourselves back in that position again.

        • apatheticonion a day ago

          You stop noticing the borrow checker after a while, and being able to write insanely parallel/performant code is quite rewarding.

          Again, not all websites need to be usable on low-end hardware or have a 1 MB memory footprint - but there are a lot of use cases that would benefit.

          Think: browser extensions that load in every tab, consume 150 MB+ multiplied by the number of open tabs, and share the main thread with the website.

          ServiceWorkers that sit as background processes in your OS even when the browser is closed, that sort of thing.

        • tekacs 2 days ago

          I use Rust for all the other reasons, real types being a major one of them:

          https://hn.algolia.com/?type=comment&query=typescript%20soun...

          It's kinda exhausting to use TypeScript and run into situations where the type system is more of a suggestion than a rule. Passing around values [1] that have a type annotation but aren't the type they're annotated as is... in many ways worse than not typing them in the first place.

          [1]: not even deserialized ones - ones that only moved within the language!
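
          A minimal example of that kind of drift, purely within the language:

          ```
          const input: unknown = 42;
          const name = input as string; // the assertion is a promise the compiler can't check
          name.toUpperCase();           // typechecks, throws a TypeError at runtime
          ```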

stevage 3 days ago

I could imagine this being useful for pre-compiling markdown.
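
For example, something along these lines, using the import-attribute pattern from the article (the `renderMarkdown` helper is hypothetical):

```
import { renderMarkdown } from './markdown.js' with { type: 'comptime' };

// Rendered once at build time; the bundle ships only the resulting HTML string.
export const aboutHtml = renderMarkdown('# About\n\nHello *world*.');
```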

anonymoushn 3 days ago

I have read the examples, and it seems like this cannot be used for aggressive hoisting of conditionals by writing "if (comptime foo)", resulting in the body of the if statement being executed unconditionally or omitted. So it cannot replace my current use of C preprocessor macros in JavaScript, though Zig's actual comptime feature could.

  • MrJohz 3 days ago

    This can be used as part of that step (i.e. converting `foo()` into `true` or `false`), but I think the expectation is that you'll have another step in the build process that automatically strips away `if(false)...` statements and inlines `if(true)` ones.

    Almost any minifier will automatically do this, for example, and most can be configured so that they only do constant folding/dead code elimination, so the result will be a file that looks like the one you've written, but with these comptime conditions removed/inlined.

    Obviously with C preprocessor macros, you've got one tool that evaluates the condition and removes the dead code, but with comptime you have more flexibility and your conditions are all written in Javascript rather than a mix of JS and preprocessor macros.
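
    Concretely, the two-step pipeline looks something like this (`isDebugBuild` and `installDebugOverlay` are made-up names):

    ```
    import { isDebugBuild } from './flags.js' with { type: 'comptime' };

    // comptime.ts inlines the call's result, leaving e.g. `if (false) { ... }`;
    // the minifier's dead-code elimination then drops the whole branch.
    if (isDebugBuild()) {
      installDebugOverlay();
    }
    ```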

    • anonymoushn 2 days ago

      I think it is a bit more troublesome to ship this comptime implementation and a minifier to the client so the specialization may be performed there than it is to ship just a C preprocessor implementation.

      • MrJohz 2 days ago

        That'll depend a lot on the context and the client, I imagine. Comptime and the minifier can be distributed in a fairly standard way as part of an NPM package's dependencies, so if you're shipping to a system that can handle NPM, then comptime doesn't really add much. But if the client doesn't have a JS runtime installed or can't easily access the NPM ecosystem, then I can imagine shipping the C preprocessor could well be easier than juggling two different tools.

  • trgwii 3 days ago

    Yes it can: the comptime expression inside the `if` would turn into a `true` or `false` literal, but you would need a separate build tool to optimize away the `if`. That's partly why comptime.ts outputs TypeScript, iirc.

    I believe both Vite and the Bun bundler would apply the optimization to eliminate constant conditionals when you use comptime.ts as a plugin.

    • anonymoushn 2 days ago

      I don't have any experience running the Vite or Bun bundlers or the TypeScript compiler on the client, but I think these are not really supported use cases.

      • alpinisme 2 days ago

        Why are you trying to run this on the client? By the time you're shipping code, you're already past the point of comptime. It'd be wasteful to do anything client-side beyond executing the code you sent.

mdarens 3 days ago

One of the most exciting features of Zig, but am I correct that this doesn't apply to types themselves, like comptime generics in Zig? I find that to be one of the most powerful ideas: type-level mappings that have the same syntax as the runtime code, where you can just set an iteration limit. This would be a great way to get around the "too large union" problem in TS, for example.

  • MKRhere 10 hours ago

    Author here. We have an idea in the works to implement `typeInfo`, but it serves more of a type -> value use case (for example, generating validations from types).

    However, going full cycle (type -> value -> type) is not as trivial because we won't get to ride on TypeScript's existing language server support, and solutions such as needing to use our own patched tsserver, etc., are too hacky for my liking.

    Also not possible are generic types as parameters to comptime functions, as in Zig.

    Happy to discuss more comptime use cases, though. Feel free to raise an issue if you'd like to discuss, and we can look into feasibility.

Thom2000 3 days ago

Interesting. I've never seen the import-with syntax, though, and it's hard to find any documentation on it. Is this a syntax extension?

shortrounddev2 3 days ago

Would be really great if it could return named functions

  • trgwii 3 days ago

    I have had many discussions with the author, and we ultimately decided not to support those kinds of use cases until we have a very solid set of guarantees. Supporting closures can quickly become very tricky when you need to preserve a function across JS processes.

    • shortrounddev2 3 days ago

      I just want to be able to select dependencies at bundle time depending on the build environment. If it's dev, use `MockService`. If it's prod, use `ProdService`. Right now I just have `index.prod.ts` and `index.dev.ts` that choose the dependencies, which is not a bad solution; I just wish I could keep my initialization code in one file and have functions return the dependencies based on the environment. I can do this at runtime, obviously, but it doesn't seem to eliminate unused dependencies well.

      I know it's a cursed idea, but I often find myself wishing TypeScript had a C++-style preprocessor.
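
      For context, one common shape for that two-entrypoint setup (the service module paths here are invented):

      ```
      // index.dev.ts
      export { MockService as Service } from './mock-service.js';

      // index.prod.ts
      export { ProdService as Service } from './prod-service.js';
      ```

      The build config points at one entrypoint or the other, so only the selected service's code ends up in the bundle.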

      • MKRhere 12 hours ago

        Author here.

        To be clear, what you're asking for is basically:

        const X = comptime(condition ? A : B);

        and have it compile down to

        const X = A;

        without attempting to serialise the functions themselves. Is this correct? The way comptime.ts currently works is that it runs the expression in a constructed block. But perhaps a new primitive, like

        import { conditional } from "comptime.ts" with { type: "comptime" };

        const X = conditional(condition, A, B);

        Might work though! I'm also interested in conditional comptime code removal, but I'm not sure about the API design there. I know bundlers already do it, but I'd like it to be possible in source->source transformations too - for example, shipping a version of a library with debugs/traces.

        Feel free to open an issue if you'd like to discuss ideas.

revskill 3 days ago

Sweet. No need for a framework to do that.