
I'm getting an error when I try indexing my documents using Laravel.

This is my main code. By using die statements, I found that I get the error "[MongoDB\Driver\Exception\InvalidArgumentException] Integer overflow detected on your platform: 300000000000" as soon as the first line executes: $users = User::all();

$users = User::all();
foreach ($users as $user) {
    $temp = $user['attributes'];
    unset($temp['_id']);
    $params = [
        'index' => 'test_index',
        'type'  => $temp['vehicle_type'],
        'id'    => $temp['lid'],
        'body'  => $temp,
    ];
    print_r($params); die;
    $response = $client->index($params);
    set_time_limit(100);
}

I am using https://github.com/jenssegers/laravel-mongodb to interface Laravel with MongoDB. My User model looks like this:

use Jenssegers\Mongodb\Eloquent\Model as Eloquent;

class User extends Eloquent
{
    protected $connection = 'mongodb';
    protected $collection = 'Test4';
}

Test4 contains a large amount of data. However, I've made sure I don't have any integer in my mapping that might cause the integer overflow error. Kindly help me out. I am new to Laravel and MongoDB, and I would be happy to provide any further info that might be required.

Also, when I try to decrease the number of fields in the mapping and index again, I get this error: "PHP Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 40 bytes) in C:\xampp\htdocs\ProTest1\vendor\jenssegers\mongodb\src\Jenssegers\Mongodb\Query\Builder.php on line 392"

  • How "big" is the collection? `User::all()` would imply you are asking for "all" the content. That is likely not a wise thing to do on anything beyond a "small" size. In fact [`all()`](https://laravel.com/docs/5.4/collections#method-all) seems to imply that this would turn the "cursor" into an "array", and that really would not be wise. – Neil Lunn Jun 20 '17 at 07:14
  • It seems you are trying to store a big-data type into an integer, so you are getting this exception. – RAUSHAN KUMAR Jun 20 '17 at 07:20
  • Neil Lunn, thanks for replying. Actually the data contains around 75k documents, which account for around 7 GB of space. – Ankit Chandel Jun 20 '17 at 08:55
  • RAUSHAN KUMAR, thanks for replying. I haven't used an integer to store any index. I changed all my integers to text when I got this error, but the error still persists. – Ankit Chandel Jun 20 '17 at 08:59

1 Answer


Thanks Neil Lunn, your feedback really helped. I was fetching all the data at once, which consumed a large amount of memory. So instead, I tried extracting a chunk of data at a time using the code below, which worked.

// Build the Elasticsearch client once, rather than once per document.
$client = Elasticsearch::create()->build();

User::chunk(100, function ($users) use ($client) {
    foreach ($users as $user) {
        $temp = $user['attributes'];
        unset($temp['_id']);
        $params = [
            'index' => 'test_index',
            'type'  => $temp['type'],
            'id'    => $temp['lid'],
            'body'  => $temp,
        ];
        $response = $client->index($params);
    }
});
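
A possible further refinement (a sketch of my own, not part of the tested code above, assuming the official elasticsearch-php client that `index()` comes from): each chunk can be sent as a single bulk request instead of one `index()` call per document, which cuts the HTTP round trips per chunk from 100 to 1. The field names `type` and `lid` are the same ones used above.

$client = Elasticsearch::create()->build();

User::chunk(100, function ($users) use ($client) {
    $body = [];
    foreach ($users as $user) {
        $temp = $user['attributes'];
        unset($temp['_id']);
        // Action line: tells Elasticsearch where to put the document.
        $body[] = [
            'index' => [
                '_index' => 'test_index',
                '_type'  => $temp['type'],
                '_id'    => $temp['lid'],
            ],
        ];
        // Source line: the document itself.
        $body[] = $temp;
    }
    // One HTTP request per chunk of 100 instead of 100 separate requests.
    $response = $client->bulk(['body' => $body]);
});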