Proposal: AST mode #887

Closed · wants to merge 6 commits

Conversation

@andrewbranch (Contributor) commented Dec 21, 2018

Adds a loader option ast that causes each file to be piped through as a JSON version of its abstract syntax tree. Useful for generating custom docs and such.

(I actually tried this in the past with TypeStrong/TypeDoc by making typedoc-loader, but found that it doesn’t give me enough information.)

This functionality could easily be a separate loader entirely, but I think there's some performance benefit to recycling the same ts.Program that ts-loader is already using, and anyone who wants to get the TypeScript AST of a file into their webpack-bundled app is probably already using ts-loader 😄

Borrowed some guidance from https://astexplorer.net (specifically, the circular-reference-safe JSON stringify package they use).

This is in the README, but I want to specifically point out two things:

  1. To consume the AST, you need json-loader to convert this loader’s output to a module
  2. I would recommend using inline loader syntax:
const ast = require('!json-loader!ts-loader?ast!./source-file.ts');
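
Roughly, the new code path boils down to something like the following simplified sketch (not the exact implementation; the kind-name mapping is just for illustration, and the real code uses the cycle-safe stringify package mentioned above):

import * as ts from 'typescript';

// Simplified sketch of the AST-mode code path: grab the SourceFile from the
// ts.Program ts-loader already has, then stringify it while skipping
// back-references so the tree doesn't cycle.
function serializeAst(program: ts.Program, fileName: string): string {
  const sourceFile = program.getSourceFile(fileName);
  if (!sourceFile) {
    throw new Error(`${fileName} is not part of the ts.Program`);
  }
  return JSON.stringify(
    sourceFile,
    (key, value) => {
      if (key === 'parent') {
        return undefined; // parent back-references are what make the tree circular
      }
      if (key === 'kind' && typeof value === 'number') {
        return ts.SyntaxKind[value]; // readable node kinds (illustration only)
      }
      return value;
    },
    2
  );
}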

@johnnyreilly (Member)

This seems super cool! Good work! I can see you're having problems getting your comparison test to pass on Windows; I'd lay money on this being something to do with the difference between \n and \r\n. (You may already have this in hand.)

I've no objection to this becoming part of ts-loader given that you (and therefore likely others too) will find utility in it. That said, I'd love to understand the use case a little better, so I've a number of questions:

  • Could you take me through an example scenario where you imagine using this please? What would you do with your AST? You mention documentation; I wonder if you could be more concrete in your example?

  • Would you use this functionality at the same time as you build an app, or never at the same time? i.e. do you imagine ts-loader building the app and the documentation as part of the same build step, or is it one or the other?

  • Would you use this in watch mode?

@andrewbranch (Contributor, author)

Good questions!

You mention documentation; I wonder if you could be more concrete in your example?

The actual thing that I've tried to do with typedoc-loader is make a self-generating docs site for an internal React component library (written in TypeScript). Assuming some amazing helper functions for parsing the AST that I haven't yet written, you can imagine a starting point like:

export class ComponentDocumentationPage extends React.Component<{ componentName: string }> {
  render() {
    const { componentName } = this.props;
    const ast = require(`!json-loader!ts-loader?ast!../components/${componentName}.tsx`);
    const componentNode = getExportedReactComponent(ast, componentName);
    const componentExample = getJSDocExample(componentNode);
    const propsInterfaceName = getPropsInterfaceName(componentNode);
    const propsInterfaceNode = getExportedInterface(ast, propsInterfaceName);
    const propsExtendsName = getExtendedTypeName(propsInterfaceNode);
    return (
      <Page title={componentName}>
        <Markdown>{getJSDocDescription(componentNode)}</Markdown>
        {componentExample ? tryRenderLiveExample(componentExample) : null}
        <h2><code>{propsInterfaceName}</code></h2>
        {propsExtendsName ? <h3>extends {propsExtendsName}</h3> : null}
        <ul>
          {getOwnMembers(propsInterfaceNode).map(prop => (
            <li>{prop.name} ({getSignatureString(prop)}): {getJSDocDescription(prop)}</li>
          ))}
        </ul>
      </Page>
    );
  }
}

and then you can easily have:

const ButtonDocs = <ComponentDocumentationPage componentName="button" />;

assuming you have a ../components/button.tsx.

(In fact, the thing that TypeDoc doesn't give me even in this simple example, is a reliable way even just to print what ButtonProps extends if it's a literal type expression such as Partial<BaseButtonProps>. TypeDoc strips out a lot of information that was originally present in the AST, including the raw text of the source file, which could be used as a worst-case fallback for letting the reader know what type is being extended.)
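
Those helpers don't exist yet; as a rough idea of their shape, one of them might look like this (hypothetical code over the serialized AST objects, with assumptions about how kinds and identifiers come through):

// Hypothetical helper over the serialized AST (plain objects, not ts.Node
// instances). Assumes `kind` is serialized as the SyntaxKind name and that
// identifiers keep their `escapedText` property.
function getExportedInterface(ast: any, name: string): any | undefined {
  return (ast.statements || []).find(
    (statement: any) =>
      statement.kind === 'InterfaceDeclaration' &&
      statement.name &&
      statement.name.escapedText === name &&
      (statement.modifiers || []).some((m: any) => m.kind === 'ExportKeyword')
  );
}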

Would you use this functionality at the same time as you build an app, or never at the same time? i.e. do you imagine ts-loader building the app and the documentation as part of the same build step, or is it one or the other?

Yeah, this is only valuable as part of ts-loader (as opposed to being its own loader) if you're already using ts-loader for something else, which is why I anticipate using this mode explicitly via the inline require loader syntax. You can see in my example above that the component being documented is imported as an AST, but it could be imported as a normal component for actual usage at the same time. E.g., my example ComponentDocumentationPage is itself a component written in TypeScript that will be loaded with ts-loader; it consumes other components also written in TypeScript and imported normally (like Page and Markdown). You can imagine an element <ComponentDocumentationPage componentName="markdown" /> which documents the Markdown component while using the Markdown component to do it. 😄
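
Concretely, the same source file can be pulled in both ways side by side (illustrative paths):

// Compiled as usual by ts-loader via the webpack config:
import { Button } from '../components/button';
// Pulled in a second time as a JSON AST, only where the docs need it:
const buttonAst = require('!json-loader!ts-loader?ast!../components/button.tsx');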

Would you use this in watch mode?

Yes. Am I missing something to make that work correctly? I assumed since getting the AST of a file doesn't depend on the state of other files (like getting diagnostics for a file does), watching would Just Work™ out of the box.


I also wanted to get your opinion on this design decision:

To consume the AST, you need json-loader to convert this loader’s output to a module

This is an arbitrary requirement based on the fact that loaders are generally "supposed" to do one thing only and be as modular as possible. But, I'm not sure how practical that really is in this case, and it forces the user to download (and write in the import statement) json-loader as an extra dependency, when I could just add module.exports = to the front of the output I emit to avoid forcing that extra dependency onto people. Thoughts?
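
To make the two options concrete, the difference is only in the string the loader returns (a sketch, with serializedAst standing in for the JSON output):

// Sketch of the two possible loader outputs.
function emitForJsonLoader(serializedAst: string): string {
  // Option A: return raw JSON; json-loader turns it into a module downstream.
  return serializedAst;
}

function emitAsCommonJsModule(serializedAst: string): string {
  // Option B: return a CommonJS module directly; no extra loader needed.
  return 'module.exports = ' + serializedAst + ';';
}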

@andrewbranch (Contributor, author)

I can see you're having problems getting your comparison test to pass on Windows

Ahh, good times. There's an absolute path somewhere in the AST.
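
One idea for making the comparison output portable (not in the PR yet, just a sketch): normalize line endings and swap the machine-specific root for a placeholder before snapshotting:

// Hypothetical normalization for the comparison output: replace the absolute
// project root with a stable token and normalize CRLF so Windows and POSIX
// runs produce identical snapshots.
function normalizeSnapshot(output: string, rootDir: string): string {
  const forwardSlashRoot = rootDir.replace(/\\/g, '/');
  return output.split(forwardSlashRoot).join('<rootDir>').replace(/\r\n/g, '\n');
}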

@johnnyreilly (Member)

Yes. Am I missing something to make that work correctly? I assumed since getting the AST of a file doesn't depend on the state of other files (like getting diagnostics for a file does), watching would Just Work™ out of the box.

You should be fine!

This is an arbitrary requirement based on the fact that loaders are generally "supposed" to do one thing only and be as modular as possible. But, I'm not sure how practical that really is in this case, and it forces the user to download (and write in the import statement) json-loader as an extra dependency, when I could just add module.exports = to the front of the output I emit to avoid forcing that extra dependency onto people. Thoughts?

Since I don't have a use for this functionality myself I'm somewhat on the fence. That said, although the documentation isn't super clear (https://webpack.js.org/loaders/json-loader/), I was under the impression that json-loader no longer has to be specifically added / configured; since webpack 2 it's been "in the box".

What's your gut feeling? Or to put it another way: can I suggest you take both approaches for a test drive on a test project of your own and reach a conclusion about which feels better from a consumption point of view? Then we go with that. Ultimately the first customer of this feature is you; pick what feels "nicest".

@johnnyreilly (Member)

The comparison tests look painful. They always are. Can I suggest we try to use an execution test here instead? I'd actually prefer it as they tend to be far more reliable.

I'd also like to reduce the pain in your life! Merry Christmas 🎅🌲!

@andrewbranch (Contributor, author)

Update here: what I really want to do actually involves using the type checker as well, which means the unbound AST alone won't do. It's also way easier to work with the TypeScript API when it's the real TypeScript API, not a serialized version of it. I'm experimenting (outside of ts-loader) with running a bunch of static analysis at the loader level, then serializing just what I need. As I'm doing these experiments, I'm fully understanding for the first time the value of all the file caching and dependency tracking that ts-loader does 😄

I'm thinking a better approach for something like this might be to provide a function that allows the consumer to work with the compiler on the loader side and return a serialized whatever they want:

// In client code
require('ts-loader?serializer=../serializers/props&serializationOptions[componentName]=Button!../components/button.tsx')
// ../serializers/props.ts
import * as ts from 'typescript';

export default function serializeProps(program: ts.Program, fileName: string, serializationOptions: any) {
  const sourceFile = program.getSourceFile(fileName);
  const typeChecker = program.getTypeChecker();
  const { componentName } = serializationOptions;
  // The world is your oyster
  return JSON.stringify(...);
}
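
On the loader side I imagine the wiring would be roughly this (hypothetical names, just to make the shape of the idea concrete):

// Hypothetical loader-side plumbing for the `serializer` option: resolve the
// serializer module relative to the file being processed and hand it the
// ts.Program ts-loader is already maintaining.
import * as path from 'path';
import * as ts from 'typescript';

interface SerializerOptions {
  serializer: string;
  serializationOptions?: Record<string, any>;
}

function runSerializer(program: ts.Program, filePath: string, options: SerializerOptions): string {
  const serializerPath = path.resolve(path.dirname(filePath), options.serializer);
  const serialize = require(serializerPath).default;
  return serialize(program, filePath, options.serializationOptions);
}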

I had begun to think that this is too abstract and edge-casey to put into ts-loader, but if someone wanted to do this sort of thing on their own, the cost is very high to get it working with a watch. What do you think about this direction?

@johnnyreilly (Member) commented Jan 4, 2019

I'm thinking a better approach for something like this might be to provide a function that allows the consumer to work with the compiler on the loader side and return a serialized whatever they want:

That sounds like a good approach, as it puts the power in the users' hands.

I had begun to think that this is too abstract and edge-casey to put into ts-loader, but if someone wanted to do this sort of thing on their own, the cost is very high to get it working with a watch. What do you think about this direction?

It does feel like a very abstract / edge-casey thing to put into ts-loader. That said, I'm not opposed to that provided a number of conditions are met:

  • normal usage of ts-loader (by which I mean performance really) is not impacted
  • impact on the codebase is not invasive (which I don't think it should be / is so far)
  • the new functionality is extremely well documented; it feels like a very niche need, but the sort of thing that the 1% of people who do care will care about a great deal. So if it's in, it must be very clear how it can be used so that those people can actually make use of it. No point having an awesome feature in a product that no-one understands 😄

I reckon, keep on experimenting and keep chatting about it. Happy to be a sounding board. Happy new year!

@andrewbranch (Contributor, author)

Thanks for the input! I’m going to close this and hopefully come back with a new one soon. Happy new year!
