Extremely long compile times #21
Comments
I thought I had an issue open for this but it appears I had forgotten to open one. rustc seems to have some issues with deeply nested types + associated types (see rust-lang/rust#21231). To work around it until it gets fixed you have to write smaller parsers, i.e. factor out parts into freestanding functions instead of storing them in local variables. Example at: https://github.com/Marwes/parser-combinators/blob/master/benches/json.rs#L110-L128 (It might be possible to specialize the parsers directly as well, say `fn expr(input: State<&str>) -> ParseResult<Expr, &str>` instead of `fn expr<I: Stream>(input: State<I>) -> ParseResult<Expr, I>`.)
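The specialization idea can be illustrated with a toy example. This is not combine's real API; `digits_generic` and `digits_str` are hypothetical names used only to contrast the two signatures. The generic function is re-instantiated by the compiler for every concrete stream type it is used with, while the specialized one is type-checked and translated exactly once:

```rust
// Generic version: one instantiation per concrete iterator type `I`.
fn digits_generic<I: Iterator<Item = char> + Clone>(input: I) -> (u32, I) {
    let mut rest = input;
    let mut n = 0;
    while let Some(c) = rest.clone().next() {
        match c.to_digit(10) {
            Some(d) => {
                n = n * 10 + d;
                let _ = rest.next(); // consume the digit
            }
            None => break,
        }
    }
    (n, rest)
}

// Specialized version: fixed to `&str`, so it is compiled once.
fn digits_str(input: &str) -> (u32, &str) {
    let end = input
        .find(|c: char| !c.is_ascii_digit())
        .unwrap_or(input.len());
    (input[..end].parse().unwrap_or(0), &input[end..])
}

fn main() {
    assert_eq!(digits_str("42abc"), (42, "abc"));
    let (n, _) = digits_generic("42abc".chars());
    assert_eq!(n, 42);
}
```

The trade-off is the one described above: the specialized form can no longer parse arbitrary `Stream`s, but the compiler does far less work per call site.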
@Marwes Excellent, thank you very much for the quick advice, I really appreciate that. I'll try refactoring some of my code and let you know if I'm still having problems.
Okay, I did some refactoring (hawkw/seax@96af0b7) and compile times have reduced significantly. My laptop no longer smells like it's melting, and maybe I can even turn lint-on-save back on. Thanks so much.
Specialized the parsers to parse only over `&str`s, rather than `Stream<Item=char>`. I’d rather the parsers be able to operate over any stream of characters, but this improves the compile time somewhat (see Marwes/combine#21). Once the rustc issues are resolved, it should be trivial to re-generalize the parsers.
I happened upon a workaround to half of this problem, namely the part where functions need to be specialized. Say you have a bunch of parser functions:

```rust
fn parse1<I>(input: State<I>) -> ParseResult<i32, I> where I: Stream { ... }
fn parse2<I>(input: State<I>) -> ParseResult<i32, I> where I: Stream { ... }
fn parse3<I>(input: State<I>) -> ParseResult<i32, I> where I: Stream { ... }
```

If you move all of these functions into an impl block for a type, you can get the same compile times as with the specialized version:

```rust
struct P<I>(PhantomData<fn(I) -> I>);

impl<I> P<I>
where I: Stream {
    fn parse1(input: State<I>) -> ParseResult<i32, I> { ... }
    fn parse2(input: State<I>) -> ParseResult<i32, I> { ... }
    fn parse3(input: State<I>) -> ParseResult<i32, I> { ... }
}
```

In addition to this, when these functions refer to each other the calls need to be specialized.
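The impl-block grouping described above can be shown as a self-contained toy program. The `Stream` trait, `StrStream`, and the parser bodies here are stand-ins for illustration, not combine's actual types:

```rust
use std::marker::PhantomData;

// Stand-in for combine's `Stream` trait; the real trait is richer.
trait Stream {
    fn next_char(&mut self) -> Option<char>;
}

struct StrStream<'a>(std::str::Chars<'a>);

impl<'a> Stream for StrStream<'a> {
    fn next_char(&mut self) -> Option<char> {
        self.0.next()
    }
}

// The workaround: group the parser functions in an impl block on a
// zero-sized type, so the stream type `I` is named once for the group
// and cross-references between parsers stay within one instantiation.
struct P<I>(PhantomData<fn(I) -> I>);

impl<I: Stream> P<I> {
    fn digit(input: &mut I) -> Option<u32> {
        input.next_char().and_then(|c| c.to_digit(10))
    }

    fn two_digits(input: &mut I) -> Option<u32> {
        let a = Self::digit(input)?;
        let b = Self::digit(input)?;
        Some(a * 10 + b)
    }
}

fn main() {
    let mut s = StrStream("42".chars());
    assert_eq!(P::two_digits(&mut s), Some(42));
}
```

The `PhantomData<fn(I) -> I>` field just ties `I` to the type without storing any data or affecting variance in a surprising way.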
Interesting! I don't think that will help my current situation (all of my parsers are already specialized and that isn't really the problem; compile times are still slow since the parsers are still somewhat complex), but it's good to know!
rust-lang/rust#22204 might be related too.
rust-lang/rust#20304 also seems to be an issue, and reducing the nesting of types does not seem to work around that problem either. After using the above techniques on https://github.com/Marwes/parser-combinators-language the trans phase still takes a very long time, with about half the total time spent just normalizing associated types =/
I guess it should also be mentioned that moving the parser implementation to a separate crate which only exposes specialized parser functions can be used to avoid long compile times on every recompile, since rustc can just link the already compiled code in that case, avoiding a full recompilation.
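A sketch of the suggested crate split, with hypothetical directory and crate names (the layout is illustrative, not taken from any project in this thread):

```
my_project/
├── Cargo.toml        # [workspace] members = ["parser", "app"]
├── parser/           # slow-to-compile crate; exposes only specialized
│   ├── Cargo.toml    #   entry points, e.g. `pub fn parse(&str) -> ...`
│   └── src/lib.rs
└── app/              # recompiles quickly: as long as `parser` is
    ├── Cargo.toml    #   unchanged, cargo just links the already-built
    └── src/main.rs   #   artifact instead of re-type-checking it
```

The key point is that the expensive type checking and translation of the parser only re-runs when files inside `parser/` change.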
I have done the things recommended here, but I still get a compile time of over 1 minute. Do you know what I can do to improve things?
@Ralle Either wrap all your functions in a type as I suggested above, or specialize all your parser functions. If you still find the compilation times too long, move the parser to its own crate and you will at least avoid the compile times when you do not modify the parser.
I have been meaning to publish the project in which I use combine and combine-language pretty soon, though I have not had the time to get it into proper shape yet. That should show what I have done to make it work with a large parser in a compiler.
I would love to see that whenever it's ready to be shown to the public; I've been thinking about trying to refactor my Scheme parser some, and I'd like to see how you're using the library in your own code.
Thank you for all your help so far. I upgraded to the latest Rust and did the struct thing that you showed here. Still 1m 20s; I shaved off 20s. Any more suggestions?
Reducing how many levels you nest parsers can reduce the typechecking times a bit, but unfortunately the compile times will always be quite long compared to other Rust code. I did some modifications to your gist. The best thing, really, is to have the parser in a separate crate as mentioned above. EDIT: When I compile without needing to recompile the parser, a compilation takes just 20s.
Which is a shame, because of how one is generally kind of intended to use parser combinators. :)
It's a shame that it's necessary to reduce some nesting of the types, but I'd say it's generally a good idea to create many small parsers anyway. If you really need to create a large parser you have to wrap it in a closure.
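One general way to cap type nesting in combinator-style code is type erasure: hiding a combinator's concrete type behind a boxed closure, so the compiler builds small erased types instead of one huge nominal type for the whole grammar. This is a hypothetical stand-alone sketch of that idea, not combine's API; `Parser`, `char_p`, and `pair` are made-up names:

```rust
// Erased parser type: the concrete closure type disappears behind the Box,
// at the cost of one allocation and a virtual call per combinator.
type Parser<'a, T> = Box<dyn Fn(&'a str) -> Option<(T, &'a str)> + 'a>;

// Parse a single expected character.
fn char_p<'a>(expected: char) -> Parser<'a, char> {
    Box::new(move |input| {
        let mut chars = input.chars();
        match chars.next() {
            Some(c) if c == expected => Some((c, chars.as_str())),
            _ => None,
        }
    })
}

// Run two parsers in sequence; the result type stays flat because both
// arguments are already erased.
fn pair<'a, A: 'a, B: 'a>(p: Parser<'a, A>, q: Parser<'a, B>) -> Parser<'a, (A, B)> {
    Box::new(move |input| {
        let (a, rest) = p(input)?;
        let (b, rest) = q(rest)?;
        Some(((a, b), rest))
    })
}

fn main() {
    let ab = pair(char_p('a'), char_p('b'));
    assert_eq!(ab("abc"), Some((('a', 'b'), "c")));
    assert_eq!(ab("xbc"), None);
}
```

The runtime cost of the indirection is the usual trade-off against the compile-time cost of deeply nested concrete types.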
Your suggestions helped me a lot. Thank you!
I finally got the compiler into a bit better shape and so I moved it to GitHub. You can find it here. The parser is found in the parser folder for anyone looking for how I use combine. The full compile times are indeed long, but as long as the parser is not modified it is as fast as any other Rust program.
There seems to be a regression in typechecking in recent nightlies as well as in the 1.7.0 beta. If you are using a nightly version I really recommend using one from before these regressions, as compile times are really bad regardless of the workarounds. My 6 year old laptop takes 35 minutes when compiling the tests (for combine) vs 2 minutes before the regression. Rust issue: rust-lang/rust#31157
It seems to be better on the latest nightly-2016-06-06. I see a 4x improvement in compile time.
@pepyakin Thanks for reminding me to post here! Yeah, with rust-lang/rust#33816 merged I recommend using a nightly after 2016-06-05 until the release trains catch up.
Closing this, as with the release of Rust 1.11 there is no longer an exponential compile time problem (compiling can still be a little slow due to the amount of types created, but it's no longer a critical issue that has to be worked around).
I know this issue is a little vague, so please bear with me. As I've started using `parser-combinators`, I've noticed that the compile times for my project have sharply increased; more than I would expect from the amount of code I've written. `cargo build` now takes 8 to 11 minutes to complete. This is on a fairly new machine, too - I have a MacBook Pro with a 2.5GHz quad-core i7, and I've generally seen very good performance from `cargo`/`rustc`. I guess I'm just curious to know what's causing these very long compile times. Is this to be expected when using this library, am I misusing it in some way that's confusing `rustc`, or is something else wrong? Of course, it's likely difficult to determine exactly what's going on, but I'd welcome any additional information.