Re: OutOfMemoryError when opening a large file in an XText-generated language editor [message #1404973 is a reply to message #1404199] |
Wed, 06 August 2014 09:20 |
Jan Koehnlein Messages: 760 Registered: July 2009 Location: Hamburg |
Senior Member |
A single 10MB file is at the limits of what I'd expect Xtext to handle with reasonable editor performance. That is very big for a textual file; just imagine a Java file of the same size. So first of all, I'd propose cutting the model into smaller pieces, if only for maintainability. Xtext supports cross-file references nicely out of the box.
But you have memory problems, and the measured 433MB are far below 1GB, so maybe you are not running out of plain heap space (-Xmx1g) but perm gen space (-XX:MaxPermSize=256m). What does the error message say?
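To narrow that down, the running VM can report the heap limit it actually got. A minimal sketch (the class name is mine, nothing Xtext-specific) that shows whether -Xmx took effect before you start suspecting perm gen:

```java
// Prints the maximum heap the JVM was started with, so you can compare
// it against the -Xmx value you think you configured in eclipse.ini.
public class HeapCheck {
    public static void main(String[] args) {
        long maxHeapMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
        System.out.println("Max heap (-Xmx): ~" + maxHeapMb + " MB");
    }
}
```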
The stacktrace doesn't show anything particularly suspicious: In order to avoid unnecessary work on an outdated model, the validator triggers a (partial) reparse of the changed document.
Things you could check:
- Don't use AntLR backtracking (this has to be explicitly enabled in your MWE2 workflow)
- Update to Xtext 2.6, particularly if your language is based on Xbase.
- Reduce lookahead in your grammar, e.g. by adding leading keywords. This can improve the efficiency of partial parsing, as the sections to be reparsed after a document change are much smaller.
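For the backtracking point, this is roughly how the option appears in a typical MWE2 generator workflow; a sketch only, since the exact fragment path varies by Xtext version and project setup. If your workflow never sets backtrack, it is already off:

```mwe2
// In the language's Generate*.mwe2 workflow (fragment path varies by Xtext version)
fragment = parser.antlr.XtextAntlrGeneratorFragment {
    options = {
        backtrack = false   // the default; true enables AntLR backtracking
        classSplitting = true
    }
}
```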
Without further information, it's hard to tell what's going on. Maybe you could show us your grammar or try to extract an example to reproduce the error?
---
Get professional support from the Xtext committers at www.typefox.io
Re: OutOfMemoryError when opening a large file in an XText-generated language editor [message #1405453 is a reply to message #1404199] |
Thu, 07 August 2014 11:53 |
Lidia Gutu Messages: 45 Registered: July 2013 |
Member |
Thank you for all your feedback, it's very much appreciated.
I am investigating the issue that Martin was so kind to report.
The JDK used is 1.5.0_04.
The VM arguments are -XX:MaxPermSize=256m -Xmx2048m.
The issue we have is that files in our Xtext-based custom language are very difficult to edit: the editor responds with a noticeable delay, and after minimizing and maximizing Eclipse it takes some time until the file is displayed again.
I have tested the following cases, each with a 10MB file left open in the editor.
To reproduce the out-of-memory issue, I tried decreasing the value of -Xmx.
With -Xmx512m -XX:MaxPermSize=256m, when I reopen Eclipse I get a pop-up with an OutOfMemory error:
"Unable to create editor ID com.foo.mylang.Lang: Editor could not be initialized.
Java heap space"
With -Xms1024 -XX:MaxPermSize=256m, Eclipse opens successfully and the 10MB file is displayed.
But I see the following OutOfMemoryError stack trace in the .log file (uploaded the full sample as OutOfMemoryError_1.txt):
!ENTRY org.apache.log4j 4 0 2014-08-07 12:49:29.381
!MESSAGE org.eclipse.xtext.builder.impl.XtextBuilder - java.lang.OutOfMemoryError: Java heap space
!STACK 0
org.eclipse.xtext.parser.ParseException: java.lang.OutOfMemoryError: Java heap space
at org.eclipse.xtext.parser.antlr.AbstractAntlrParser.doParse(AbstractAntlrParser.java:105)
And typing is very slow.
If I try to maximize Eclipse from the minimized state, it takes a while until the file is displayed.
I get a pop-up error "An internal error occurred during: "Xtext validation".
Java heap space" and some OutOfMemory errors in the .log file (OutOfMemoryError_2).
With -Xms1536 -XX:MaxPermSize=256m, a "Building workspace" job runs for a long time because of those big files, and typing is very slow. When I try to restore Eclipse, I get a gray screen for a while (uploaded a sample, "Screenshot gray screen.png").
I do not get any errors in the .log file.
With -Xmx2048m -XX:MaxPermSize=512m, I get the same issues: a gray screen on maximize, slow typing, and no stack traces in the .log file.
I haven't tested with Xtext 2.6 yet, and I am looking into how to disable AntLR backtracking.
Thank you
Lidia
Re: OutOfMemoryError when opening a large file in an XText-generated language editor [message #1411179 is a reply to message #1410285] |
Fri, 22 August 2014 15:31 |
Lidia Gutu Messages: 45 Registered: July 2013 |
Member |
I am still working on this issue with huge DSL files; maybe the following finding is useful for you.
While testing in debug mode with -Xmx1024m, I get a Java OutOfMemoryError in the 'Updating editor state' job.
I think this is caused by that job running in parallel with the 'Updating workspace' jobs:
66142: ---ABOUT TO RUN Updating editor state
66142: >>>RUN [4]: Updating editor
78848: . SCHED [5]: Updating workspace
80350: >>RUN [2]: Updating workspace
80410: <<DONE [1]: Updating workspace
80442: . SCHED [2]: Updating workspace
81992: >>RUN [2]: Updating workspace
82128: <<DONE [1]: Updating workspace
82147: . SCHED [2]: Updating workspace
83698: >>RUN [2]: Updating workspace
83704: <<DONE [1]: Updating workspace
83709: . SCHED [2]: Updating workspace
83777: . SCHED [4]: IDE Exception Handler
83777: <ERROR [3]: Updating editor state
!ENTRY org.eclipse.core.jobs 4 2 2014-08-22 17:09:28.795
!MESSAGE An internal error occurred during: "Updating editor state".
!STACK 0
java.lang.OutOfMemoryError: Java heap space
at org.antlr.runtime.ANTLRReaderStream.load(ANTLRReaderStream.java:78)
at org.antlr.runtime.ANTLRReaderStream.<init>(ANTLRReaderStream.java:53)
at org.antlr.runtime.ANTLRReaderStream.<init>(ANTLRReaderStream.java:45)
at org.eclipse.xtext.parser.antlr.AbstractAntlrParser.doParse(AbstractAntlrParser.java:62)
at org.eclipse.xtext.parser.AbstractParser.parse(AbstractParser.java:32)
at org.eclipse.xtext.parser.impl.PartialParsingHelper.fullyReparse(PartialParsingHelper.java:213)
at org.eclipse.xtext.parser.impl.PartialParsingHelper.reparse(PartialParsingHelper.java:110)
at org.eclipse.xtext.parser.antlr.AbstractAntlrParser.doReparse(AbstractAntlrParser.java:136)
at org.eclipse.xtext.parser.AbstractParser.reparse(AbstractParser.java:48)
at org.eclipse.xtext.resource.XtextResource.update(XtextResource.java:232)
at org.eclipse.xtext.ui.editor.reconciler.XtextDocumentReconcileStrategy.doReconcile(XtextDocumentReconcileStrategy.java:125)
at org.eclipse.xtext.ui.editor.reconciler.XtextDocumentReconcileStrategy.reconcile(XtextDocumentReconcileStrategy.java:55)
at org.eclipse.xtext.ui.editor.reconciler.XtextReconciler.doRun(XtextReconciler.java:363)
at org.eclipse.xtext.ui.editor.reconciler.XtextReconciler.access$2(XtextReconciler.java:350)
at org.eclipse.xtext.ui.editor.reconciler.XtextReconciler$DocumentListener$1.process(XtextReconciler.java:112)
at org.eclipse.xtext.ui.editor.reconciler.XtextReconciler$DocumentListener$1.process(XtextReconciler.java:1)
at org.eclipse.xtext.util.concurrent.IUnitOfWork$Void.exec(IUnitOfWork.java:36)
at org.eclipse.xtext.util.concurrent.AbstractReadWriteAcces.modify(AbstractReadWriteAcces.java:81)
at org.eclipse.xtext.ui.editor.model.XtextDocument$XtextDocumentLocker.modify(XtextDocument.java:219)
at org.eclipse.xtext.util.concurrent.AbstractReadWriteAcces.process(AbstractReadWriteAcces.java:111)
at org.eclipse.xtext.ui.editor.reconciler.XtextReconciler$DocumentListener.performNecessaryUpdates(XtextReconciler.java:109)
at org.eclipse.xtext.ui.editor.model.XtextDocument.updateContentBeforeRead(XtextDocument.java:165)
at org.eclipse.xtext.ui.editor.model.XtextDocument$XtextDocumentLocker.beforeReadOnly(XtextDocument.java:186)
at org.eclipse.xtext.ui.editor.model.XtextDocument$XtextDocumentLocker.beforeReadOnly(XtextDocument.java:1)
at org.eclipse.xtext.util.concurrent.AbstractReadWriteAcces.readOnly(AbstractReadWriteAcces.java:61)
at org.eclipse.xtext.ui.editor.model.XtextDocument$XtextDocumentLocker.readOnly(XtextDocument.java:246)
at org.eclipse.xtext.ui.editor.model.XtextDocument.readOnly(XtextDocument.java:84)
at org.eclipse.xtext.ui.editor.DirtyStateEditorSupport$UpdateEditorStateJob.run(DirtyStateEditorSupport.java:123)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:54)
I hope that rescheduling the jobs will help.
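I have been reading about scheduling rules in the Eclipse jobs framework, which apparently let two jobs be serialized by sharing the same rule. A minimal self-contained sketch of that pattern; ISchedulingRule is re-declared here only so the snippet compiles standalone, in a real plugin it comes from org.eclipse.core.runtime.jobs:

```java
// Sketch of Eclipse's mutex-rule pattern for serializing jobs.
// Stand-in for org.eclipse.core.runtime.jobs.ISchedulingRule, declared
// locally so this example has no plugin dependencies.
interface ISchedulingRule {
    boolean contains(ISchedulingRule rule);
    boolean isConflicting(ISchedulingRule rule);
}

public class MutexRule implements ISchedulingRule {
    public boolean contains(ISchedulingRule rule) {
        return rule == this;
    }
    public boolean isConflicting(ISchedulingRule rule) {
        return rule == this;
    }
    public static void main(String[] args) {
        MutexRule mutex = new MutexRule();
        // Two jobs given the same rule instance would never run concurrently,
        // e.g. editorJob.setRule(mutex); builderJob.setRule(mutex);
        System.out.println(mutex.isConflicting(mutex)); // prints "true"
    }
}
```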
Do you have a better idea how to solve it?
Thanks
[Updated on: Fri, 22 August 2014 15:37]
Re: OutOfMemoryError when opening a large file in an XText-generated language editor [message #1462404 is a reply to message #1404973] |
Wed, 05 November 2014 14:43 |
Lidia Gutu Messages: 45 Registered: July 2013 |
Member |
Hello,
Can someone help me, please? I am trying to reduce the lookahead, either from the *.mwe2 workflow or from the grammar (*.xtext).
How can I add the k parameter in *.mwe2? It seems this bug was not solved in Xtext 2.6.0, as the following produces an error message:
fragment = parser.antlr.ex.rt.AntlrGeneratorFragment auto-inject {
    options = { k = 4 backtrack = false classSplitting = true ignoreCase = true }
}
What is wrong with this grammar sample, and how can I improve it by adding leading keywords?
grammar com.dsl.lidia.MyDSL with org.eclipse.xtext.common.Terminals

generate myDSL "http://www.dsl.com/lidia/MyDSL"

Model:
    (definitions+=Definition)*;

Definition:
    InterfaceDefinition;

InterfaceDefinition:
    "interface" name=DSLID "{" (interfaceElementList+=InterfaceElement)* "}";

InterfaceElement:
    FunctionDeclaration;

FunctionDeclaration:
    functionType=FunctionType name=DSLID "(" ")" (";")?;

enum FunctionType:
    Function = "function"
    | NewOption = "NewOption"
    | NewOption2 = "NewOption2";

DSLID:
    'value1' | 'value2'
    | 'value3' | 'value4'
    | 'value5' | 'value6' | 'value7'
    | 'value8' | 'value9'
    | 'value10' | 'value11' | 'value12'
    | 'value13' | 'value14'
    | 'value15'
    | 'function' |
    | DSLID
    ;

DSLID : ID
Thank you in advance
Lidia
[Updated on: Wed, 05 November 2014 14:47]