
I have three identical Laravel projects on the same server and a separate database for queues and jobs.

Projects

  1. Project A
  2. Project B
  3. Project C

Databases

  1. Project A database
  2. Project B database
  3. Project C database
  4. Laravel Queue Database

The queue worker is always running in Project A:

php artisan queue:work

I have already configured the queue and jobs to use a single connection, and I can add jobs from Projects B and C to Project A's queue and run them.
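For context, pushing a job from Project B or C onto the shared connection looks roughly like this (a sketch; `AppendLogToFile` is one of the jobs shown further down, and 'database' is the connection name from config/queue.php):

```php
// Dispatch from Project B or C; the job row lands in the shared
// laravelqueue database, where Project A's worker picks it up.
AppendLogToFile::dispatch($activity)->onConnection('database');
```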

Problems

  1. When I upload or add a file, it is stored in Project A even when it belongs to Project B or C.

  2. I am using Spatie roles and permissions. When I add a permission via a job, it says the permission is not found (it checks Project A's database).

I've searched Google and also asked ChatGPT, but found only a few answers, all of which were very old.

My .env

QUEUE_CONNECTION=database

# start: these belong to queues and jobs
QUEUE_DB_HOST="localhost"
QUEUE_DB_PORT="3306"
QUEUE_DB_DATABASE="laravelqueue"
QUEUE_DB_USERNAME="userName"
QUEUE_DB_PASSWORD="UserPassword"
QUEUE_FAILED_DB_CONNECTION=queue
# end: these belong to queues and jobs

My config/database.php

'connections' => [

    'mysql' => [
        'driver' => 'mysql',
        'url' => env('DATABASE_URL'),
        'host' => env('DB_HOST', '127.0.0.1'),
        'port' => env('DB_PORT', '3306'),
        'database' => env('DB_DATABASE', 'forge'),
        'username' => env('DB_USERNAME', 'forge'),
        'password' => env('DB_PASSWORD', ''),
        'unix_socket' => env('DB_SOCKET', ''),
        'charset' => 'utf8mb4',
        'collation' => 'utf8mb4_unicode_ci',
        'prefix' => '',
        'prefix_indexes' => true,
        'strict' => false,
        'engine' => null,
        'options' => extension_loaded('pdo_mysql') ? array_filter([
            PDO::MYSQL_ATTR_SSL_CA => env('MYSQL_ATTR_SSL_CA'),
        ]) : [],
    ],


    'queue' => [
        'driver' => 'mysql',
        'url' => env('QUEUE_DATABASE_URL'),
        'host' => env('QUEUE_DB_HOST', '127.0.0.1'),
        'port' => env('QUEUE_DB_PORT', '3306'),
        'database' => env('QUEUE_DB_DATABASE', 'forge'),
        'username' => env('QUEUE_DB_USERNAME', 'forge'),
        'password' => env('QUEUE_DB_PASSWORD', ''),
        'unix_socket' => env('QUEUE_DB_SOCKET', ''),
        'charset' => 'utf8mb4',
        'collation' => 'utf8mb4_unicode_ci',
        'prefix' => '',
        'strict' => true,
        'engine' => null,
        'options' => extension_loaded('pdo_mysql') ? array_filter([
            PDO::MYSQL_ATTR_SSL_CA => env('MYSQL_ATTR_SSL_CA'),
        ]) : [],
    ],

],

My config/queue.php

return [

    'default' => env('QUEUE_CONNECTION', 'sync'),

    'connections' => [

        'sync' => [
            'driver' => 'sync',
        ],

        'database' => [
            'driver' => 'database',
            'table' => 'jobs',
            'queue' => 'default',
            'retry_after' => 90,
            'after_commit' => false,
            'connection' => 'queue'
        ],

    ],

    'failed' => [
        'driver' => env('QUEUE_FAILED_DRIVER', 'database-uuids'),
        'database' => env('QUEUE_FAILED_DB_CONNECTION', 'queue'),
        'table' => 'failed_jobs',
    ],

];
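One way to keep jobs separated on the shared connection is to give each project its own queue name, so jobs can be routed back to the right application. A hedged sketch; `QUEUE_NAME` is an assumed env variable, not part of the original config:

```php
// config/queue.php, in each of the three projects:
'database' => [
    'driver' => 'database',
    'connection' => 'queue',
    'table' => 'jobs',
    // e.g. QUEUE_NAME=project-a in Project A's .env,
    // project-b in Project B's, project-c in Project C's.
    'queue' => env('QUEUE_NAME', 'default'),
    'retry_after' => 90,
    'after_commit' => false,
],
```

Each project would then run its own worker against only its queue, e.g. `php artisan queue:work --queue=project-b`, so every job executes inside the application that owns the data.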

My job

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\Storage;

class AppendLogToFile implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    private $activity;

    public function __construct($activity)
    {
        $this->activity = $activity;
    }

    public function handle()
    {
        $date = $this->activity['date'];
        $company_id = $this->activity['Company_id'];
        $path = '/activities/' . $company_id . '/';
        $logs = json_decode(Storage::get($path . $date . '.json'), true) ?? [];
        $logs[] = (object) $this->activity;
        Storage::put($path . $date . '.json', json_encode($logs));
    }
}
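Problem 1 (files landing in Project A) comes from the worker resolving `Storage` against Project A's default disk. One option, also raised in the comments, is to address a specific disk explicitly. The sketch below assumes a hypothetical `project_b` disk and root path that would need to be defined in Project A's config/filesystems.php:

```php
// config/filesystems.php in Project A: a disk pointing at Project B's
// storage directory (the root path here is an assumption; adjust it
// to the real layout on the server).
'project_b' => [
    'driver' => 'local',
    'root' => '/var/www/project-b/storage/app',
],

// In the job, write through that disk instead of the default one:
Storage::disk('project_b')->put($path . $date . '.json', json_encode($logs));
```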

My second job

namespace App\Jobs;

use App\Models\SystemFeatures;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldBeUnique;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\DB;
use Spatie\Permission\Models\Role;

class GivingPermissionJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    private $role;

    private $permissionID;

    public $tries = 50;

    public function __construct($role, $permissionID)
    {
        $this->role = $role;
        $this->permissionID = $permissionID;
        $this->onQueue('high');
    }

    public function handle()
    {
        // Note: config() expects a config key, so config('DB_CONNECTION') returns
        // null; the default connection name lives at config('database.default').
        DB::setDefaultConnection(config('database.default'));
        $this->role->givePermissionTo($this->permissionID);
    }
}
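Problem 2 has the same root cause: the worker runs inside Project A, so `config('database.default')` (and Spatie's models with it) resolve to Project A's database. One hedged workaround is to serialize the originating project's connection name with the job and switch to it in `handle()`. The `$dbConnection` property and the per-project connection entries are assumptions, not part of the original code:

```php
// Sketch: pass the dispatching project's DB connection name along with the
// job. A name like 'project_b' would also have to exist in Project A's
// config/database.php, since the worker runs there. The property is
// deliberately not named $connection, which the Queueable trait already
// uses for the queue connection.
private $dbConnection;

public function __construct($role, $permissionID, string $dbConnection)
{
    $this->role = $role;
    $this->permissionID = $permissionID;
    $this->dbConnection = $dbConnection;
    $this->onQueue('high');
}

public function handle()
{
    // Switch the default connection so Spatie's Permission model queries
    // the database of the project that dispatched the job.
    DB::setDefaultConnection($this->dbConnection);
    $this->role->givePermissionTo($this->permissionID);
}
```

Spatie also caches permissions per application, so the permission cache may need resetting after switching connections; that caveat is worth verifying against the spatie/laravel-permission documentation.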
Comments

  • Could you run different queue workers for each project by using different 'queue' settings for each project ('project1', 'project2', ...; instead of 'default')? – lagbox May 30 '23 at 16:09
  • If I change the settings back to the original, yes. But now I have configured the default of all of them to the first project. – Mohammad Edris Raufi May 30 '23 at 16:16
  • Perhaps independent queue workers for each application would allow each job to be run by the application that created it? – lagbox May 30 '23 at 16:24
  • Independent queue workers work perfectly, but if I run multiple workers at the same time and keep them running, they will use a lot of resources. – Mohammad Edris Raufi May 30 '23 at 16:27
  • You should have different storage disks defined for each project and do `Storage::disk("whatever")->put(...)`. See https://laravel.com/docs/10.x/filesystem#obtaining-disk-instances – miken32 May 30 '23 at 16:54
  • I am very confused... Why are you sending jobs from other projects to Project A (and what do you mean by identical; why have 3)? Project A should process Project A info, Project B should process Project B info, and Project C should process Project C info. If they are exactly the same, why are they split into 3? If it is because of resources, then you should have a single project and use a load balancer. Of course you should also be sharing the same storage so they can all see the files, but a load balancer would be your solution. You need to explain more about why you have this setup. – matiaslauriti May 30 '23 at 19:49
  • Thanks for mentioning load balancing. Could you also please give me a link so I can learn more about load balancers? – Mohammad Edris Raufi May 30 '23 at 19:55
  • @MohammadEdrisRaufi this is how AWS (Amazon Web Services) provides and handles a load balancer: https://aws.amazon.com/elasticloadbalancing/. The main idea is that you set up a pool of machines available to do the same work, in your case a single project, but multiple EC2 instances (what AWS calls its virtual machines). Everything is configured the same; the idea is that if you have, for example, 50 requests per second and 5 machines, the load balancer sends 10 requests to each EC2 instance so none of them gets saturated. Then you have another non-public machine just for queues. – matiaslauriti May 30 '23 at 20:25

0 Answers