Re: Help requested for closure constraint. [message #1053593 is a reply to message #1053529]
Mon, 06 May 2013 12:08
Eclipse User |
Hi
What you are describing is the consequence of a depth-first traversal.
A breadth-first traversal can naturally tail recurse and so avoid any
significant stack growth.
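For illustration only, here is a minimal Java sketch of that idea. It is not
the Eclipse OCL evaluator code; the IterativeClosure class, its closure()
helper and the getPrev() accessor in the usage comment are hypothetical names
used just to show a worklist-based (breadth-first) closure that keeps the Java
call stack flat however long the prev chain is.

import java.util.ArrayDeque;
import java.util.Collections;
import java.util.Deque;
import java.util.LinkedHashSet;
import java.util.Set;
import java.util.function.Function;

// Sketch only: an iterative transitive closure using an explicit work queue
// instead of recursive evaluator calls, so stack depth stays constant.
public final class IterativeClosure {

    public static <T> Set<T> closure(T root, Function<T, Iterable<T>> step) {
        Set<T> result = new LinkedHashSet<>();    // nodes reached so far
        Deque<T> worklist = new ArrayDeque<>();   // nodes still to expand
        worklist.add(root);
        while (!worklist.isEmpty()) {
            T current = worklist.remove();        // FIFO gives breadth-first order
            for (T next : step.apply(current)) {
                if (next != null && result.add(next)) {
                    worklist.add(next);           // each node expanded at most once
                }
            }
        }
        return result;
    }

    // Usage sketch for a singly linked list with a getPrev() accessor:
    //   Set<SampleNode> all = closure(node, n -> n.getPrev() == null
    //       ? Collections.<SampleNode>emptyList()
    //       : Collections.singletonList(n.getPrev()));
}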
Please submit a Bugzilla.
You may find that the improved modularity of the new pivot implementation
(org.eclipse.ocl.examples.library.iterator.ClosureIteration) makes it easier
to understand and to replace.
Regards
Ed Willink
On 06/05/2013 13:46, Daniel Di Nardo wrote:
> Dear readers,
>
> I'm trying to implement a constraint on the members of a linked list described by the following UML class diagrams:
>
> [UML class diagram omitted]
>
> In plain English, I want to ensure that, for any given node, the first predecessor node with the same channelId has a frameCount that is one less than the given node's frameCount. I introduced an index value to keep track of the order in which I created the nodes, but I'm hoping that using this value won't be necessary.
>
> My constraint is currently as follows:
>
>
> context SampleNode inv:
>
> let allNodes : Set(SampleNode) = self->closure(prev),
> allNodesWithSameChannelId : Set(SampleNode) = allNodes->select(x : SampleNode | x.channelId = self.channelId),
> orderedNodesWithSameChannelId : OrderedSet(SampleNode) = allNodesWithSameChannelId->sortedBy( index )
> in
>
> orderedNodesWithSameChannelId->size() > 0
>
> implies
>
> self.frameCount = orderedNodesWithSameChannelId->last().frameCount + 1
>
>
>
> The issue I'm having is that while the above constraint works when we evaluate the OCL against a SampleNode instance that has 2500 predecessors, once I increase the number of predecessors to 5000 I get the following error:
>
>
> Exception in thread "main" java.lang.StackOverflowError
> at java.util.HashMap.getEntry(HashMap.java:344)
> at java.util.HashMap.containsKey(HashMap.java:335)
> at org.eclipse.ocl.AbstractEvaluationEnvironment.add(AbstractEvaluationEnvironment.java:122)
> at org.eclipse.ocl.EvaluationVisitorImpl.visitVariable(EvaluationVisitorImpl.java:2368)
> at org.eclipse.ocl.ecore.impl.VariableImpl.accept(VariableImpl.java:654)
> at org.eclipse.ocl.internal.evaluation.IterationTemplate.initializeIterators(IterationTemplate.java:126)
> at org.eclipse.ocl.internal.evaluation.IterationTemplate.evaluate(IterationTemplate.java:83)
> at org.eclipse.ocl.internal.evaluation.IterationTemplateClosure.evaluateResult(IterationTemplateClosure.java:90)
> at org.eclipse.ocl.internal.evaluation.IterationTemplate.evaluate(IterationTemplate.java:90)
> at org.eclipse.ocl.internal.evaluation.IterationTemplateClosure.evaluateResult(IterationTemplateClosure.java:90)
> at org.eclipse.ocl.internal.evaluation.IterationTemplate.evaluate(IterationTemplate.java:90)
> at org.eclipse.ocl.internal.evaluation.IterationTemplateClosure.evaluateResult(IterationTemplateClosure.java:90)
> at org.eclipse.ocl.internal.evaluation.IterationTemplate.evaluate(IterationTemplate.java:90)
>
>
> Ideally I'd like to evaluate a linked list having up to one million nodes. Can anyone suggest a way of doing this without the memory issues?
>
> Your help/tips would be appreciated.
>
> Thanks.