
[platformsh][Solr] Memory leak in a Content creation script

Hello, hello,

I have a memory leak problem with a content creation script.

PHP Fatal error:  Allowed memory size of 268435456 bytes exhausted (tried to allocate 20480 bytes) in /app/ezplatform/vendor/doctrine/dbal/lib/Doctrine/DBAL/Statement.php on line 109

Fatal error: Allowed memory size of 268435456 bytes exhausted (tried to allocate 20480 bytes) in /app/ezplatform/vendor/doctrine/dbal/lib/Doctrine/DBAL/Statement.php on line 109
07:38:23 CRITICAL  [php] Fatal Error: Allowed memory size of 268435456 bytes exhausted (tried to allocate 20480 bytes)
[
  "exception" => Symfony\Component\Debug\Exception\OutOfMemoryException {
    #message: "Error: Allowed memory size of 268435456 bytes exhausted (tried to allocate 20480 bytes)"
    #code: 0
    #file: "./vendor/doctrine/dbal/lib/Doctrine/DBAL/Statement.php"
    #line: 109
    #severity: E_ERROR
  }
]
[]

In Statement.php line 109:

  [Symfony\Component\Debug\Exception\OutOfMemoryException]
  Error: Allowed memory size of 268435456 bytes exhausted (tried to allocate 20480 bytes)

The problem occurs on Platform.sh, not on my dev env initialized with eZ Launchpad.

If I add an ini_set('memory_limit', -1);, the script still ends prematurely, with an even terser message: Killed (presumably the process being terminated by the host's OOM killer rather than by PHP itself).

If I enable debug output (-vvv), I notice a lot of writes to the ezsearch_object_word_link table.

Solr is properly enabled on my dev env, so I don't see those writes there (ezsearch_object_word_link is the index table of the legacy database search engine, so heavy writes to it suggest content is being indexed in the database instead of in Solr).
So my problem probably comes from my Solr configuration on Platform.sh.
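
For reference, a minimal sketch of how the Solr engine is normally selected in eZ Platform 2.x configuration, assuming the eZ Platform Solr Search Engine Bundle is installed (the %search_engine% and %solr_dsn% parameter names follow the janit example repository mentioned later in this thread and are assumptions, not this project's actual config):

# app/config/config.yml (sketch)
ezpublish:
    repositories:
        default:
            search:
                engine: '%search_engine%' # must resolve to "solr"; with "legacy" the engine indexes into ezsearch_object_word_link
                connection: default

ez_search_engine_solr:
    endpoints:
        endpoint0:
            dsn: '%solr_dsn%' # e.g. http://host:port/solr
            core: collection1
    connections:
        default:
            entry_endpoints:
                - endpoint0
            mapping:
                default: endpoint0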

But maybe there is something I can do in my code to optimize it?
My code looks like this:

$list = []; // very long list of remote_ids (~10,000)
foreach ($list as $remote_id) {
    $content = $this->getContentByRemoteId($remote_id); // returns null if it does not exist, no exception
    if (!$content) {
        $data = $this->getDataWithCurl($remote_id); // curl call, closed with curl_close($ch)

        $contentType = $contentTypeService->loadContentTypeByIdentifier('article');
        $contentStruct = $contentService->newContentCreateStruct($contentType, 'fre-FR');
        $contentStruct->remoteId = $remote_id;

        $contentStruct->setField('title', $data['title']);
        // ...

        $locationCreateStruct = $locationService->newLocationCreateStruct($parent_location_id);
        $draft = $contentService->createContent($contentStruct, [$locationCreateStruct]);
        $contentService->publishVersion($draft->versionInfo);
    }
}
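
For reference, a few generic optimizations for long-running eZ Platform import loops, sketched under the assumption that the script is a console command with the repository services and the Doctrine DBAL connection injected (an illustrative sketch, not the actual command): load the content type once, disable the Doctrine SQL logger (in debug mode it keeps every executed query in memory, which would match an OOM inside Statement.php), and let the garbage collector run periodically.

// Illustrative sketch; $connection is the Doctrine\DBAL\Connection,
// $contentService, $contentTypeService and $locationService are the
// usual eZ repository services.

// In debug mode Doctrine's SQL logger accumulates every executed query
// in memory; disabling it often keeps long imports flat.
$connection->getConfiguration()->setSQLLogger(null);

// Load the content type once instead of ~10,000 times.
$contentType = $contentTypeService->loadContentTypeByIdentifier('article');

$processed = 0;
foreach ($list as $remote_id) {
    if ($this->getContentByRemoteId($remote_id)) {
        continue; // already imported
    }

    $data = $this->getDataWithCurl($remote_id);

    $contentStruct = $contentService->newContentCreateStruct($contentType, 'fre-FR');
    $contentStruct->remoteId = $remote_id;
    $contentStruct->setField('title', $data['title']);
    // ...

    $locationCreateStruct = $locationService->newLocationCreateStruct($parent_location_id);
    $draft = $contentService->createContent($contentStruct, [$locationCreateStruct]);
    $contentService->publishVersion($draft->versionInfo);

    // Give PHP a chance to reclaim cyclic references every 100 items.
    if (++$processed % 100 === 0) {
        gc_collect_cycles();
    }
}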

Hello Remy,

Could this help? Check https://github.com/janit/ez-platform-cloud-solr-config. It’s an example based on eZ Platform v1.13, Platform.sh and Solr 6.6.


I'm facing exactly the same problem. I already posted a question: Allowed memory size bytes exhausted: eZ platform import contents.
@remy_php, could you please share any solution to the memory problem, such as creating eZ content without performing indexing, or configuring Platform.sh the same way? By the way, my configuration is the same as https://github.com/janit/ez-platform-cloud-solr-config.

Hi @ahmed-bhs,

My .platform.app.yaml

relationships:
    search: "solr:collection1" # [Nom du service]:[Nom du endpoint] dans .platform/services.yaml 

My .platform/services.yaml

solr:
    type: 'solr:6.3'
    disk: 2048
    configuration:
        # https://docs.platform.sh/configuration/services/solr.html#solr-6
        configsets: # https://docs.platform.sh/configuration/services/solr.html#configsets
            mainconfig: !archive "solr/ez" # Default config
        endpoints:
            collection1:
                core: collection1
        cores:
            collection1:
                conf_dir: !archive "solr/collection1" # Config specific to the collection1 core.
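
With that wiring, Platform.sh exposes the Solr endpoint to the application through the PLATFORM_RELATIONSHIPS environment variable (base64-encoded JSON). A quick sanity-check sketch for what DSN the app actually sees (the 'search' key matches the relationship name above; the exact field names are worth double-checking against the Platform.sh docs):

// Sketch: inspect the "search" relationship injected by Platform.sh.
$relationships = json_decode(base64_decode(getenv('PLATFORM_RELATIONSHIPS')), true);
$solr = $relationships['search'][0];

// Should print something like http://solr.internal:8080/solr/collection1
printf("http://%s:%d/%s\n", $solr['host'], $solr['port'], $solr['path']);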

But in fact, I think I'll simply split my import into several batches.

Import 1,000 at a time, every half hour.

With a select that doesn't return the content already imported.
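
A minimal sketch of that batching idea, with hypothetical helpers fetchRemoteIdsToImport() and importOne() standing in for the select and for the create/publish logic shown earlier (neither is part of the eZ API):

$batchSize = 1000;

// Hypothetical helper: returns at most $batchSize remote_ids that have no
// matching row in ezcontentobject.remote_id yet, so each run resumes
// where the previous one stopped.
$remoteIds = $this->fetchRemoteIdsToImport($batchSize);

foreach ($remoteIds as $remote_id) {
    $this->importOne($remote_id); // the create + publish logic shown earlier
}

// Scheduled every half hour, e.g. in .platform.app.yaml:
// crons:
//     import:
//         spec: '*/30 * * * *'
//         cmd: 'php bin/console app:import-articles'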