Eclipse Community Forums
OutOfMemoryError when opening a large file in an XText-generated language editor [message #1404199] Wed, 30 July 2014 11:21
Martin Oberhuber
Messages: 1007
Registered: July 2009
Senior Member
Dear all,

We have generated a language IDE using Xtext; the runtime uses Eclipse 3.8.2 / EMF 2.9.2 / org.eclipse.xtext.runtime_2.5.2.v201402120812.

When loading a large file (10MB in size) into the language IDE's Editor and starting to type, the IDE becomes extremely slow and eventually runs into an OutOfMemory error even with > 1GB heap space.

We checked for leak suspects in Eclipse MemoryAnalyzer (attached screenshot + backtrace): 33MB are retained by ANTLRReaderStream, and 150MB are retained by XtextTokenStream. But the bigger problem seems to be that the editor re-parses the file multiple times in a Job, as per the backtrace (reconcile... frames).

My first question is whether such scalability problems are to be expected when parsing a 10MB file in the editor... could it be that our grammar is specified in a non-ideal way?

Or, looking at it the other way round: if editing 10MB files is unreasonable, I'm wondering if Xtext editors could have something like the CDT C/C++ editor's "scalability mode" (Preferences > C/C++ > Editor > Scalability). This would turn off certain parsing functions when opening a file that is larger than a certain threshold.

We would appreciate any pointers or thoughts on how best to proceed.

Thanks!
Martin
Re: OutOfMemoryError when opening a large file in an XText-generated language editor [message #1404957 is a reply to message #1404199] Wed, 06 August 2014 08:39
Jeff MAURY
Messages: 44
Registered: July 2009
Member
Can you show the exception message? Which JDK are you using? That may be related to PermGen. Can you show the Eclipse settings (JVM args)?
Re: OutOfMemoryError when opening a large file in an XText-generated language editor [message #1404973 is a reply to message #1404199] Wed, 06 August 2014 09:20
Jan Koehnlein
Messages: 760
Registered: July 2009
Location: Hamburg
Senior Member
A single 10MB file is at the limits of what I'd expect Xtext to be able to handle with reasonable editor performance. This is rather big for a textual file; just imagine a Java file of the same size. So first of all, I'd propose cutting the model into smaller pieces, if only for reasons of maintainability. Xtext supports cross-file references nicely out of the box.

But you have memory problems, and the measured 433MB are far below 1GB, so maybe you are not running out of plain heap space (-Xmx1g) but perm gen space (-XX:MaxPermSize=256m). What does the error message say?
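For reference, the relevant VM arguments live in the -vmargs section of eclipse.ini; a typical setup might look like the following (the values are only examples, not a recommendation for your particular case):

-vmargs
-Xms512m
-Xmx1024m
-XX:MaxPermSize=256m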

The stacktrace doesn't show anything particularly suspicious: In order to avoid unnecessary work on an outdated model, the validator triggers a (partial) reparse of the changed document.

Things you could check:
- Don't use AntLR backtracking (this has to be explicitly enabled in your MWE2 workflow)
- Update to Xtext 2.6, particularly if your language is based on Xbase.
- Reduce lookahead in your grammar, e.g. by adding leading keywords. This can improve the efficiency of partial parsing, as the sections to be reparsed after a document change are much smaller.
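On that last point, a minimal illustration (this is a made-up grammar fragment, not yours): if every alternative starts with its own keyword, the parser can pick the alternative after a single token, which keeps the lookahead small and makes partial reparsing after an edit cheaper.

Element:
    VariableDeclaration | FunctionDeclaration;

VariableDeclaration:
    'var' name=ID ';';

FunctionDeclaration:
    'def' name=ID '(' ')' ';';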

Without further information, it's hard to tell what's going on. Maybe you could show us your grammar or try to extract an example to reproduce the error?


---
Get professional support from the Xtext committers at www.typefox.io
Re: OutOfMemoryError when opening a large file in an XText-generated language editor [message #1405012 is a reply to message #1404199] Wed, 06 August 2014 11:23
Lieven Lemiengre
Messages: 3
Registered: September 2010
Junior Member
In our plugin we often have to deal with >10MB files. These are typically generated files, so normally users don't have to edit them. We built a special 'lightweight' editor to deal with these files. The lightweight editor doesn't have type-time error checking, semantic highlighting, jump to declaration, etc.; it has basic syntax highlighting and error markers. The error markers are updated when the user saves the file. Even with this editor, editing can still be slow, but the bottleneck is now in the Eclipse text editor.

Building this is a few days' work. I don't have any code to share, but it's not hard (if you are familiar with Eclipse).

We use an IContentDescriber to decide what editor should be loaded.
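For example, a rough sketch of such a describer (not Lieven's actual code; the class name and the 5MB threshold are made up): it reads the stream only far enough to decide whether the file exceeds the threshold, so that a dedicated "large file" content type, registered via the org.eclipse.core.contenttype.contentTypes extension point and bound to the lightweight editor (e.g. via a contentTypeBinding in the org.eclipse.ui.editors extension), matches only big files.

import java.io.IOException;
import java.io.InputStream;

import org.eclipse.core.runtime.QualifiedName;
import org.eclipse.core.runtime.content.IContentDescriber;
import org.eclipse.core.runtime.content.IContentDescription;

// Hypothetical example: marks a file as "large" once it exceeds 5 MB,
// so a "large file" content type (and the lightweight editor bound to it) is used.
public class LargeFileContentDescriber implements IContentDescriber {

    private static final long THRESHOLD = 5 * 1024 * 1024; // assumed threshold

    public int describe(InputStream contents, IContentDescription description) throws IOException {
        long size = 0;
        byte[] buffer = new byte[8192];
        int read;
        while ((read = contents.read(buffer)) != -1) {
            size += read;
            if (size > THRESHOLD) {
                return VALID;   // large file: this content type matches
            }
        }
        return INVALID;         // small file: fall through to the normal Xtext editor
    }

    public QualifiedName[] getSupportedOptions() {
        return new QualifiedName[0];
    }
}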
Re: OutOfMemoryError when opening a large file in an XText-generated language editor [message #1405453 is a reply to message #1404199] Thu, 07 August 2014 11:53
Lidia Gutu
Messages: 45
Registered: July 2013
Member
Thank you for all your feedback, it's very much appreciated.

I am investigating the issue that Martin was so kind to report.

The JDK used is 1.5.0_04.
The VM arguments are -XX:MaxPermSize=256m -Xmx2048m.
The issue we have is that files in our custom Xtext-based language are very difficult to edit: responses come with a delay, and after minimizing and maximizing Eclipse it takes some time until the page is displayed again.

I have tested the following cases, each with a 10MB file left open in the editor.

In order to reproduce the out-of-memory issue, I tried decreasing the value of -Xmx.

With -Xmx512m -XX:MaxPermSize=256m, when I reopen Eclipse I get a pop-up with an OutOfMemory error:
"Unable to create editor ID com.foo.mylang.Lang: Editor could not be initialized.
Java heap space"

With -Xms1024 -XX:MaxPermSize=256m, Eclipse opens successfully and the 10MB file is displayed.
But I see the following OutOfMemoryError stack trace in the .log file (full sample uploaded as OutOfMemoryError_1.txt):
!ENTRY org.apache.log4j 4 0 2014-08-07 12:49:29.381
!MESSAGE org.eclipse.xtext.builder.impl.XtextBuilder - java.lang.OutOfMemoryError: Java heap space
!STACK 0
org.eclipse.xtext.parser.ParseException: java.lang.OutOfMemoryError: Java heap space
at org.eclipse.xtext.parser.antlr.AbstractAntlrParser.doParse(AbstractAntlrParser.java:105)


And typing is very slow.
If I maximize Eclipse from the minimized state, it takes a while until the file is displayed.
I got a pop-up error "An internal error occurred during: "Xtext validation".
Java heap space" and an OutOfMemoryError in the .log file (OutOfMemoryError_2).


With -Xms1536 -XX:MaxPermSize=256m,
there is a running "Building workspace" job that takes a lot of time because of those big files, and typing is very slow. When I try to restore Eclipse, I get a gray screen for a while (uploaded a sample, "Screenshot gray screen.png").
I do not get any errors in the .log file.


Having "-Xmx2048m -XX:MaxPermSize=512m "
I am getting same issues: gray screen on maximize, slow typing, no stack traces in .log

I haven't tested with Xtext 2.6 yet, and I am looking into how to disable ANTLR backtracking.

Thank you
Lidia
Re: OutOfMemoryError when opening a large file in an XText-generated language editor [message #1405470 is a reply to message #1404973] Thu, 07 August 2014 12:33
Lidia Gutu
Messages: 45
Registered: July 2013
Member
I got another useful OutOfMemory stack trace, taken from the heap dump, very similar to the first one posted by Martin.
//--------full sample in uploaded file OutOfMemoryError_3.txt-----------------

ERROR org.eclipse.xtext.ui.editor.reconciler.XtextDocumentReconcileStrategy - Parsing in reconciler failed.
org.eclipse.xtext.parser.ParseException: java.lang.OutOfMemoryError: Java heap space
at org.eclipse.xtext.parser.antlr.AbstractAntlrParser.doParse(AbstractAntlrParser.java:105)
at org.eclipse.xtext.parser.antlr.AbstractAntlrParser.parse(AbstractAntlrParser.java:84)
at org.eclipse.xtext.parser.antlr.AbstractAntlrParser.doParse(AbstractAntlrParser.java:62)
at org.eclipse.xtext.parser.AbstractParser.parse(AbstractParser.java:32)
at org.eclipse.xtext.parser.impl.PartialParsingHelper.fullyReparse(PartialParsingHelper.java:213)
at org.eclipse.xtext.parser.impl.PartialParsingHelper.reparse(PartialParsingHelper.java:110)
at org.eclipse.xtext.parser.antlr.AbstractAntlrParser.doReparse(AbstractAntlrParser.java:136)
at org.eclipse.xtext.parser.AbstractParser.reparse(AbstractParser.java:48)
at org.eclipse.xtext.resource.XtextResource.update(XtextResource.java:233)
at org.eclipse.xtext.ui.editor.reconciler.XtextDocumentReconcileStrategy.doReconcile(XtextDocumentReconcileStrategy.java:135)
at org.eclipse.xtext.ui.editor.reconciler.XtextDocumentReconcileStrategy.reconcile(XtextDocumentReconcileStrategy.java:56)
at org.eclipse.xtext.ui.editor.reconciler.XtextReconciler.doRun(XtextReconciler.java:368)
at org.eclipse.xtext.ui.editor.reconciler.XtextReconciler.access$2(XtextReconciler.java:354)
at org.eclipse.xtext.ui.editor.reconciler.XtextReconciler$1.process(XtextReconciler.java:298)
at org.eclipse.xtext.ui.editor.reconciler.XtextReconciler$1.process(XtextReconciler.java:1)
....

//--------------------------------------------
Re: OutOfMemoryError when opening a large file in an XText-generated language editor [message #1405779 is a reply to message #1405470] Fri, 08 August 2014 06:51
Lidia Gutu
Messages: 45
Registered: July 2013
Member
I would like to correct myself about the JDK version used: it's 1.6.0_21, not 1.5.
Re: OutOfMemoryError when opening a large file in an XText-generated language editor [message #1405895 is a reply to message #1404973] Fri, 08 August 2014 12:25
Lidia Gutu
Messages: 45
Registered: July 2013
Member
Jan, thank you for your suggestions, but from what I have figured out so far, our current DSL does not have ANTLR backtracking enabled, which is the default. Please correct me if I am wrong.

The *.mwe2 file has the backtrack option commented out:

// The antlr parser generator fragment.
fragment = parser.antlr.XtextAntlrGeneratorFragment auto-inject {
   //  options = {
   //      backtrack = true
   //  }
}
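For what it's worth, spelling the default out would look like the following; since backtrack defaults to false, the commented block above should already mean backtracking is off:

fragment = parser.antlr.XtextAntlrGeneratorFragment auto-inject {
    options = {
        // explicit, but equivalent to the default
        backtrack = false
    }
}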

Re: OutOfMemoryError when opening a large file in an XText-generated language editor [message #1408191 is a reply to message #1405012] Thu, 14 August 2014 13:13
Lidia Gutu
Messages: 45
Registered: July 2013
Member
Hi Lieven, all

Could you be so kind as to provide some hints about how to disable some features for big DSL files? I want to disable parsing, and maybe refreshing of the outline view, only when a big file is loaded. So far I have just commented out the reconcile method, and I am not sure if this is the right approach.
I am trying to make a big DSL file editable and to turn off all the functions that prevent typing in it.

I am looking into validation, type checking and scoping.
So far I have found that the NamesAreUniqueValidator is enabled and the corresponding annotation has been added:

 
fragment = validation.JavaValidatorFragment {
     composedCheck = "org.eclipse.xtext.validation.NamesAreUniqueValidator"
}

@ComposedChecks(validators= {org.eclipse.xtext.validation.NamesAreUniqueValidator.class}) 
public class AbstractDslJavaValidator extends AbstractDeclarativeValidator { }


If this is the right approach, how can I disable that annotation for big files?
Is there a single place to disable it?
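For illustration, this is the kind of guard I have in mind for individual checks (a rough sketch only; the class name, the check method and the 1MB limit are invented, and this is not an official Xtext switch):

import org.eclipse.emf.ecore.EObject;
import org.eclipse.xtext.nodemodel.ICompositeNode;
import org.eclipse.xtext.nodemodel.util.NodeModelUtils;
import org.eclipse.xtext.validation.Check;

public class MyDslJavaValidator extends AbstractDslJavaValidator {

    private static final int MAX_VALIDATED_LENGTH = 1024 * 1024; // assumed 1MB limit

    // Skip the expensive check when the containing document is very large.
    @Check
    public void checkSomethingExpensive(EObject element) {
        if (isTooLarge(element)) {
            return;
        }
        // ... the actual validation would go here ...
    }

    private boolean isTooLarge(EObject element) {
        ICompositeNode node = NodeModelUtils.getNode(element);
        return node != null && node.getRootNode().getTotalLength() > MAX_VALIDATED_LENGTH;
    }
}

This would only guard our own checks, though; it would not switch off the composed NamesAreUniqueValidator per file.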
Thank you
Re: OutOfMemoryError when opening a large file in an XText-generated language editor [message #1410285 is a reply to message #1405012] Wed, 20 August 2014 10:08
Lidia Gutu
Messages: 45
Registered: July 2013
Member
Can you please help me? What MWE2 workflow should be used for a lightweight editor?
What fragments are mandatory?
I kept only the following, but the editor cannot be displayed:
fragment = grammarAccess.GrammarAccessFragment auto-inject {}
fragment = parser.antlr.XtextAntlrGeneratorFragment auto-inject {}
fragment = parser.antlr.XtextAntlrUiGeneratorFragment auto-inject {}

and I am getting the following error message:
Could not open the editor: The editor class could not be instantiated. This usually indicates a missing no-arg constructor or that the editor's class name was mistyped in plugin.xml.

Thank you
Re: OutOfMemoryError when opening a large file in an XText-generated language editor [message #1411179 is a reply to message #1410285] Fri, 22 August 2014 15:31
Lidia Gutu
Messages: 45
Registered: July 2013
Member
I am still working on improving the situation with huge DSL files, and maybe the following finding can be useful for you.

While testing in debug mode with -Xmx1024m, I get a Java OutOfMemoryError for the job 'Updating editor state'.
I think this is because that job runs in parallel with the other job, 'Updating workspace':
66142: ---ABOUT TO RUN  Updating editor state
   66142: >>>RUN   [4]: Updating editor 
   78848: . SCHED [5]: Updating workspace
   80350: >>RUN   [2]: Updating workspace
   80410: <<DONE  [1]: Updating workspace
   80442: . SCHED [2]: Updating workspace
   81992: >>RUN   [2]: Updating workspace
   82128: <<DONE  [1]: Updating workspace
   82147: . SCHED [2]: Updating workspace
   83698: >>RUN   [2]: Updating workspace
   83704: <<DONE  [1]: Updating workspace
   83709: . SCHED [2]: Updating workspace
   83777: . SCHED [4]: IDE Exception Handler
   83777: <ERROR [3]: Updating editor state


!ENTRY org.eclipse.core.jobs 4 2 2014-08-22 17:09:28.795
!MESSAGE An internal error occurred during: "Updating editor state".
!STACK 0
java.lang.OutOfMemoryError: Java heap space
at org.antlr.runtime.ANTLRReaderStream.load(ANTLRReaderStream.java:78)
at org.antlr.runtime.ANTLRReaderStream.<init>(ANTLRReaderStream.java:53)
at org.antlr.runtime.ANTLRReaderStream.<init>(ANTLRReaderStream.java:45)
at org.eclipse.xtext.parser.antlr.AbstractAntlrParser.doParse(AbstractAntlrParser.java:62)
at org.eclipse.xtext.parser.AbstractParser.parse(AbstractParser.java:32)
at org.eclipse.xtext.parser.impl.PartialParsingHelper.fullyReparse(PartialParsingHelper.java:213)
at org.eclipse.xtext.parser.impl.PartialParsingHelper.reparse(PartialParsingHelper.java:110)
at org.eclipse.xtext.parser.antlr.AbstractAntlrParser.doReparse(AbstractAntlrParser.java:136)
at org.eclipse.xtext.parser.AbstractParser.reparse(AbstractParser.java:48)
at org.eclipse.xtext.resource.XtextResource.update(XtextResource.java:232)
at org.eclipse.xtext.ui.editor.reconciler.XtextDocumentReconcileStrategy.doReconcile(XtextDocumentReconcileStrategy.java:125)
at org.eclipse.xtext.ui.editor.reconciler.XtextDocumentReconcileStrategy.reconcile(XtextDocumentReconcileStrategy.java:55)
at org.eclipse.xtext.ui.editor.reconciler.XtextReconciler.doRun(XtextReconciler.java:363)
at org.eclipse.xtext.ui.editor.reconciler.XtextReconciler.access$2(XtextReconciler.java:350)
at org.eclipse.xtext.ui.editor.reconciler.XtextReconciler$DocumentListener$1.process(XtextReconciler.java:112)
at org.eclipse.xtext.ui.editor.reconciler.XtextReconciler$DocumentListener$1.process(XtextReconciler.java:1)
at org.eclipse.xtext.util.concurrent.IUnitOfWork$Void.exec(IUnitOfWork.java:36)
at org.eclipse.xtext.util.concurrent.AbstractReadWriteAcces.modify(AbstractReadWriteAcces.java:81)
at org.eclipse.xtext.ui.editor.model.XtextDocument$XtextDocumentLocker.modify(XtextDocument.java:219)
at org.eclipse.xtext.util.concurrent.AbstractReadWriteAcces.process(AbstractReadWriteAcces.java:111)
at org.eclipse.xtext.ui.editor.reconciler.XtextReconciler$DocumentListener.performNecessaryUpdates(XtextReconciler.java:109)
at org.eclipse.xtext.ui.editor.model.XtextDocument.updateContentBeforeRead(XtextDocument.java:165)
at org.eclipse.xtext.ui.editor.model.XtextDocument$XtextDocumentLocker.beforeReadOnly(XtextDocument.java:186)
at org.eclipse.xtext.ui.editor.model.XtextDocument$XtextDocumentLocker.beforeReadOnly(XtextDocument.java:1)
at org.eclipse.xtext.util.concurrent.AbstractReadWriteAcces.readOnly(AbstractReadWriteAcces.java:61)
at org.eclipse.xtext.ui.editor.model.XtextDocument$XtextDocumentLocker.readOnly(XtextDocument.java:246)
at org.eclipse.xtext.ui.editor.model.XtextDocument.readOnly(XtextDocument.java:84)
at org.eclipse.xtext.ui.editor.DirtyStateEditorSupport$UpdateEditorStateJob.run(DirtyStateEditorSupport.java:123)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:54)


I hope that rescheduling the jobs will help.
Do you have a better idea of how to solve it?
Thanks

[Updated on: Fri, 22 August 2014 15:37]

Re: OutOfMemoryError when opening a large file in an XText-generated language editor [message #1462404 is a reply to message #1404973] Wed, 05 November 2014 14:43
Lidia Gutu
Messages: 45
Registered: July 2013
Member
Hello,
Can someone help me please! I am trying to reduce the lookahead, either from the *.mwe2 workflow or from the grammar (*.xtext).

How can I add the k parameter in the *.mwe2 file? It seems that this bug was not solved in Xtext 2.6.0, as the following gives an error message:

fragment = parser.antlr.ex.rt.AntlrGeneratorFragment auto-inject {
             options = { k=4 backtrack = false classSplitting = true  ignoreCase = true  }        
}


What is wrong with this sample of the grammar?
How can I improve it by adding leading keywords?

grammar com.dsl.lidia.MyDSL with org.eclipse.xtext.common.Terminals

generate myDSL "http://www.dsl.com/lidia/MyDSL"

Model:
   (definitions+=Definition)*;
   
  
Definition:
    InterfaceDefinition
   ;
      
InterfaceDefinition
   : "interface" name=DSLID "{" (interfaceElementList+=InterfaceElement)* "}"
   ;
   
   
InterfaceElement
   :  FunctionDeclaration
   ;

FunctionDeclaration
   :  functionType=FunctionType name=DSLID    "("   ")" (";")?
   ;   
      
enum FunctionType
   : Function  =  "function"
   | NewOption   =  "NewOption"
   | NewOption2     =  "NewOption2"
   ;
   
DSLID:
   'value1'|'value2'
   |'value3'|'value4'
   |'value5'|'value6'|'value7'
   |'value8'|'value9'
   |'value10'|'value11'|'value12'
   |'value13'|'value14'
   |'value15'
   |'function'|
   |DSLID
   ;     
   
DSLID : ID 
   


Thank you in advance
Lidia

[Updated on: Wed, 05 November 2014 14:47]

Re: OutOfMemoryError when opening a large file in an XText-generated language editor [message #1503914 is a reply to message #1405012] Mon, 08 December 2014 22:19
Lidia Gutu
Messages: 45
Registered: July 2013
Member
Hello Lieven

Could you please share a piece of code showing how you use
IContentDescriber to decide which editor should be loaded?

I am tired of trying to make it work.
I have posted a separate question about my attempts:

https://www.eclipse.org/forums/index.php/m/1503901/#msg_1503901

Thank you in advance
Kind Regards
Lidia