
We have the following data, created with the script below:

robin    = graph.addVertex(label,'person','name','robin');
jeff     = graph.addVertex(label,'person','name','jeff');
andrew   = graph.addVertex(label,'person','name','andrew');

robin.addEdge('Communication',jeff,'date','2017-12-11T00:00:00Z','weight',1);
robin.addEdge('Communication',jeff,'date','2017-12-10T00:00:00Z','weight',1);
robin.addEdge('Communication',jeff,'date','2017-12-09T00:00:00Z','weight',1);
robin.addEdge('Communication',andrew,'date','2017-12-11T00:00:00Z','weight',1);
robin.addEdge('Communication',andrew,'date','2017-12-10T00:00:00Z','weight',1);
robin.addEdge('Communication',andrew,'date','2017-12-09T00:00:00Z','weight',1);

jeff.addEdge('Communication',robin,'date','2017-12-11T00:00:00Z','weight',1);
jeff.addEdge('Communication',robin,'date','2017-12-10T00:00:00Z','weight',1);
jeff.addEdge('Communication',robin,'date','2017-12-09T00:00:00Z','weight',1);
jeff.addEdge('Communication',andrew,'date','2017-12-11T00:00:00Z','weight',1);
jeff.addEdge('Communication',andrew,'date','2017-12-10T00:00:00Z','weight',1);
jeff.addEdge('Communication',andrew,'date','2017-12-09T00:00:00Z','weight',1);
andrew.addEdge('Communication',robin,'date','2017-12-11T00:00:00Z','weight',1);
andrew.addEdge('Communication',robin,'date','2017-12-10T00:00:00Z','weight',1);
andrew.addEdge('Communication',robin,'date','2017-12-09T00:00:00Z','weight',1);
andrew.addEdge('Communication',jeff,'date','2017-12-11T00:00:00Z','weight',1);
andrew.addEdge('Communication',jeff,'date','2017-12-10T00:00:00Z','weight',1);
andrew.addEdge('Communication',jeff,'date','2017-12-09T00:00:00Z','weight',1); 

We then run the traversal below.

g.V().has('name','robin').as('v').
  repeat(outE().as('e').otherV().as('v').simplePath()).
    until(has('name','jeff')).
  store('a').
    by('name').
  store('a').
    by(select(all, 'v').unfold().values('name').fold()).
  store('a').
    by(select(all, 'e').unfold().
       store('x').
         by(union(values('weight'),
                  select('x').count(local)).fold()).
       cap('x').
       store('a').
         by(unfold().limit(local, 1).fold()).unfold().
       sack(assign).
         by(constant(1d)).
       sack(div).
         by(union(constant(1d),
                  tail(local, 1)).sum()).
       sack(mult).
         by(limit(local, 1)).
       sack().sum()).
  cap('a') 

Got the result: ==>[jeff,jeff,jeff,jeff,jeff,jeff,jeff,jeff,jeff,jeff,jeff,jeff,[robin,jeff],[robin,jeff],[robin,jeff],[1],1.0,[1,1],1.5,[1,1,1],1.8333333333333333,[robin,andrew,jeff],[robin,andrew,jeff],[robin,andrew,jeff],[robin,andrew,jeff],[robin,andrew,jeff],[robin,andrew,jeff],[robin,andrew,jeff],[robin,andrew,jeff],[robin,andrew,jeff],[1,1,1,1,1],2.283333333333333,[1,1,1,1,1,1,1],2.5928571428571425,[1,1,1,1,1,1,1,1,1],2.8289682539682537,[1,1,1,1,1,1,1,1,1,1,1],3.0198773448773446,[1,1,1,1,1,1,1,1,1,1,1,1,1],3.180133755133755,[1,1,1,1,1,1,1,1,1,1,1,1,1,1,1],3.3182289932289937,[1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1],3.439552522640758,[1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1],3.547739657143682,[1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1],3.6453587047627294]

However, we would like to group the results by path and sum the scores of identical paths. Any suggestion on how to do that? With the current script the same path, e.g. [robin,andrew,jeff], shows up several times with separate scores.

As a result we want to get: [[robin,jeff]=4.3333333333333333,[robin,andrew,jeff]=27.85604971], i.e. 4.3333... is the sum of the three [robin,jeff] scores above (1.0 + 1.5 + 1.8333...) and 27.85604971 is the sum of the nine [robin,andrew,jeff] scores.
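One direction to try (only a sketch, not a verified solution): replace the separate store('a') side effects with a single group() that is keyed by the vertex names along the path and whose value traversal ends in a reducing sum(), so the scores of paths sharing the same key are added together. The score computation is the same sack-based formula as in the traversal above (each edge contributes weight / (1 + number of entries already in the shared 'x' store)); only the nested store('a') that merely polluted the output has been dropped. Whether the globally shared 'x' grows in the same order inside the group()'s child traversal, and therefore reproduces exactly 4.3333 and 27.856, still needs to be verified.

g.V().has('name','robin').as('v').
  repeat(outE().as('e').otherV().as('v').simplePath()).
    until(has('name','jeff')).
  group().
    // key: vertex names along the path, e.g. [robin,jeff] or [robin,andrew,jeff]
    by(select(all, 'v').unfold().values('name').fold()).
    // value: per-path score; the final sum() reduces all paths that map to the same key
    by(select(all, 'e').unfold().
       // 'x' accumulates [weight, number of entries already in 'x'] per edge,
       // exactly as in the traversal above
       store('x').
         by(union(values('weight'),
                  select('x').count(local)).fold()).
       cap('x').unfold().
       // per entry: sack = weight / (1 + position), then sum the terms
       sack(assign).
         by(constant(1d)).
       sack(div).
         by(union(constant(1d),
                  tail(local, 1)).sum()).
       sack(mult).
         by(limit(local, 1)).
       sack().sum())

If the shared 'x' does not accumulate in the same order under group(), an alternative is to keep the original traversal as-is and fold its output afterwards, or to compute the per-path score first and only then feed the [path, score] pairs into group().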

  • Can you please add an example of how you expect the result to look? Also, I don't get why you pollute the result with all the things you store in `a` (`[jeff,jeff,jeff,jeff...`). The formula to calculate the weight for a single path would help as well. – Daniel Kuppitz Jan 22 '18 at 16:50
  • I gave the result example. [[robin,jeff]=4.3333333333333333,[robin,andrew,jeff]=27.85604971] – Jeff Jan 23 '18 at 00:07
  • What about the formula? I don't get how you get to 4.33333. My assumption was that you would expect 1.8333 for `[robin,jeff]`. – Daniel Kuppitz Jan 23 '18 at 15:07
  • Yes, that was the issue we had. Thanks Daniel. – Jeff Jan 25 '18 at 10:25

0 Answers