How to handle or optimize large tabular data returned from an Ajax callback?


I have a simple form with a few input fields. After the Ajax submit button is clicked, the callback returns a tableselect element containing a large number of records, rendered using DataTables:


    $form['table'] = array(
      '#type' => 'tableselect',
      '#prefix' => '<div id="scrollable" class="col-md-8"><h4>View Data</h4>',
      '#suffix' => '</div>',
      '#header' => $header,
      '#options' => $options,
      '#disabled' => $disabled,
      '#attributes' => array('id' => 'sort-table'),
      '#empty' => t('Select criteria and search data'),
    );

// Callback function:

    public function setMessage(array $form, FormStateInterface $form_state) {
      // getOptions() returns an associative array of many records.
      $options = getOptions();
      $form['table']['#options'] = $options;
      return $form['table'];
    }

// jQuery to render the data using DataTables:

    Drupal.behaviors.myBehavior = {
      attach: function (context, settings) {
        var table;
        if ($.fn.dataTable.isDataTable('#sort-table')) {
          table = $('#sort-table').DataTable();
        }
        else {
          table = $('#sort-table').DataTable({
            // DataTables options go here.
          });
        }
      }
    };
$options within setMessage() is an associative array of a large number of records from CouchDB (about 20k). I've measured the execution time using microtime(true), and it hardly takes a second to get the indexed data from the CouchDB view.

But the following exceptions appear in the logs:

  • Drupal\Core\Database\DatabaseExceptionWrapper: SQLSTATE[HY000]:
    General error: 2006 MySQL server has gone away: UPDATE
    {users_field_data} SET access=:db_update_placeholder_0 WHERE uid =
    :db_condition_placeholder_0; Array ( [:db_update_placeholder_0]
    => 1530155615 [:db_condition_placeholder_0] => 1 ) in
    /var/www/html/drupal/core/modules/user/src/UserStorage.php on line 59

  • Drupal\Core\Database\DatabaseExceptionWrapper: SQLSTATE[HY000]:
    General error: 2006 MySQL server has gone away: CREATE TABLE
    {cache_config} (cid VARCHAR(255) BINARY CHARACTER SET ascii
    COLLATE ascii_general_ci NOT NULL DEFAULT '' COMMENT 'Primary Key:
    collection of data to cache.', expire INT NOT NULL DEFAULT 0
    COMMENT 'A Unix timestamp indicating when the cache entry should
    expire, or -1 for $

  • Got a packet bigger than 'max_allowed_packet' bytes

All of which point to insufficient MySQL configuration settings, or something along those lines.

I've found a few other posts describing similar MySQL errors, and have tried varying the packet limit (max_allowed_packet in my.cnf) and increasing the execution time (in php.ini).
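For reference, these are the MySQL-side knobs I have been varying; the values below are illustrative, not recommendations:

```ini
# my.cnf — illustrative values only
[mysqld]
max_allowed_packet = 64M    # largest packet/query the server will accept
wait_timeout       = 600    # seconds before an idle connection is dropped
```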

My question here is: what is happening behind the scenes while I'm updating the page via the Ajax callback?

Is the whole page, data included, being cached to the Drupal database in MySQL?
It seems the execution time and memory are being consumed by that action.

Any reasonable suggestions on how such cases can be handled when the requirement is to display a huge number of records on the same page using an Ajax callback?

There seem to be no such exceptions (even though the execution time is longer) when the data is displayed on a page by filling in the '#options' for $form['table'] directly, without using the Ajax callback.
