Setting up a cloud backup of a Drupal website

https://www.drupal.org/node/2465951
I am using this guide to set up an S3 backup for our website. Testing the connection succeeds, but when I try a backup it fails and says it does not have permission. In the S3 Grant Access I gave it all permissions.

How big of a deal is the “DO NOT USE ADMIN USER” per the post instructions?

Is there anything I can do with my user setup in StorJ so that it has “ListBucket permission”?
"Note the extra permission in the Security Policy. Even allowing the user to do "everything" ("s3:*") in the designated bucket ("arn:aws:s3:::mycompany-websitebackups/*") was not sufficient to give them access to it. The account also needed at least 's3:ListBucket' permission in the root context ("arn:aws:s3:::*") before it could work."

Hello @IowaBoy1 ,
Welcome to the forum!

According to your notes, you need to create S3 credentials that are not limited to a single bucket; create them with access to all buckets.
Also make sure that you provided gateway.storjshare.io instead of the default s3.amazonaws.com
as the Host in the backup configuration.

1 Like

Thanks for the assist.

I deleted the old S3 Grant Access and created a new one for all buckets. I flushed the cache and attempted the backup. No joy.

I changed my Host from us1.storj.io to gateway-storjshare-io (because I am a new user I cannot post more than 2 links, so I replaced the periods with dashes). I flushed the cache and tried again. Still no success.

I tried it without a Subdirectory as well.

I then tried the Host with us1.storj.io/ buckets/ upload/ (I put in spaces so it displays the URL rather than linking it.)

Any other suggestions?

I also checked and cURL is enabled on our server

Do you have some logs from this backup tool?

The Host should be gateway.storjshare.io, without https:// or anything else.

From where did you get us1.storj.io? When you generate S3 credentials, the wizard offers you only an Access Key, a Secret Key, and an Endpoint. The Endpoint should be https://gateway.storjshare.io. The endpoint https://gateway.us1.storjshare.io should work too, but it's better to use gateway.storjshare.io so requests are automatically routed to the closest location.

I got the us1 link from the StorJ Docs. I have been trying everything I find out there. And that is the url when I am signed in to my buckets so I thought it might work.

With the gateway… url:
I am getting the following error message in my Drupal logs
User warning : S3::putObject(): [AccessDenied] Access Denied. in S3::__triggerError() (line 440 of …/libraries/s3-php5-curl/S3.php ).

I have tried 2 different versions of the S3.php that are listed in the Backup module github page.

The other S3.php gives me this message
User warning : S3::putObject(): [403] Unexpected HTTP status in S3::__triggerError() (line 429 of …/libraries/s3-php5-curl/S3.php ).

I am thinking that this has something to do with the user permissions referenced in the original post and here. Solve - S3 Access Denied when calling PutObject | bobbyhadz

Could you please give me a link to the documentation? It must be fixed; the S3 Endpoint should not be us1.storj.io under any circumstances.

Could you please try to configure the AWS CLI with these S3 credentials, then try to list buckets and copy something there?
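For example, the check could look like this (the profile name, bucket name, and file name below are placeholders, not from your setup):

```shell
# Save the Storj S3 credentials under a dedicated AWS CLI profile, then
# point the CLI at the Storj gateway. "storj", "my-bucket", and "test.txt"
# are placeholder names.
aws configure --profile storj    # paste the Access Key and Secret Key when prompted
aws s3 ls --profile storj --endpoint-url https://gateway.storjshare.io
aws s3 cp test.txt s3://my-bucket/ --profile storj \
    --endpoint-url https://gateway.storjshare.io
```

If both commands succeed, the credentials and the endpoint are fine, and the problem is in the backup tool's configuration.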

Now I cannot find the doc with us1 link. Not sure where I came up with that.

I configured the AWS CLI with these S3 credentials and I was able to list buckets and upload a file via the AWS CLI.

So, permissions are OK; this backup tool must be requesting something that is not supported.
Could you please enable more extensive logging to see what exactly it requests? I have a feeling that it ignores the Endpoint and uses the AWS default; in that case it will not be able to access a bucket that is not on AWS.
What did you specify as a Region in its configuration? Does it allow an empty value there?

I found that the Drupal module I was using is hardcoded for AWS only. There is an additional module that piggybacks onto it and allows S3-compatible storage.
https://www.drupal.org/project/backup_migrate_s3
I have set up that module. With it I can configure an S3 Compatible Destination. When I do that, I get the following error message.

There was an S3 error attempting to validate the settings below - 0 [curl] 60: SSL certificate problem: certificate has expired [url] https://secureribackups.gateway.storjshare.io?prefix=SecureWebsiteBackups%2F

I tried to create a new S3 Access in StorJ but I still get this message. Any ideas on where the problem might be?

What options do they have in Advanced?
You need path-style addressing (the bucket name in the URL path rather than in the hostname).

Advanced does not open up. It seems dead

Would I maybe be able to set the path here? This is the destination's PHP file for the backup module I am using.

<?php

libraries_load('aws-sdk-php');
use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

/**
 * @file
 * Functions to handle the Amazon S3 backup destination.
 */

/**
 * A destination for sending database backups to an Amazon S3 bucket.
 *
 * @ingroup backup_migrate_destinations
 */
class backup_migrate_destination_s3_compatible extends backup_migrate_destination_remote {

  var $supported_ops = array(
    'scheduled backup',
    'manual backup',
    'remote backup',
    'restore',
    'list files',
    'configure',
    'delete',
  );

  var $default_values = array(
    'settings' => array(
      'file_directory' => '',
      's3_timeout' => '',
      's3_proxy' => '',
      's3_region' => '',
      's3_min_part_size' => '',
    ),
  );

  var $s3 = NULL;

  /**
   * S3 getters
   */
  function s3_host() {
    return $this->dest_url['scheme'] . '://' . $this->dest_url['host'];
  }
  function s3_key() {
    return $this->dest_url['user'];
  }
  function s3_secret() {
    return $this->dest_url['pass'];
  }
  function s3_bucket() {
    return $this->dest_url['path'];
  }

  /**
   * Generate a filepath with the correct prefix.
   */
  function s3_path($filename) {
    $path = '';
    if (empty($filename)) {
      $path .= '/';
    }
    if (!empty($this->settings['file_directory'])) {
      $path .= $this->settings['file_directory'];
    }
    if (!empty($filename)) {
      $path .= '/';
    }
    $path .= $filename;

    return $path;
  }

  /**
   * S3 Init.
   **/
  function s3_init() {
    // Establish a connection to the S3-compatible service with an S3 client.
    $config = array(
      'base_url' => $this->s3_host(),
      'key'      => $this->s3_key(),
      'secret'   => $this->s3_secret(),
      //'version'  => '2006-03-01',
    );

    if (!empty($this->settings['s3_region'])) {
      $config['region'] = $this->settings['s3_region'];
      $config['signature'] = 'v4';
    }

    if (!empty($this->settings['s3_proxy'])) {
      $config['request.options'] = array(
        'proxy' => $this->settings['s3_proxy'],
      );
    }

    if (!empty($this->settings['s3_timeout'])) {
      $config['curl.options'] = array(
        CURLOPT_TIMEOUT => $this->settings['s3_timeout'],
      );
    }

    $this->s3 = S3Client::factory($config);
  }

  /**
   * Save to the s3 destination.
   */
  function _save_file($file, $settings) {
    $this->s3_init();

    $options = array();

    if (!empty($this->settings['s3_min_part_size'])) {
      $options['min_part_size'] = $this->settings['s3_min_part_size'] * 1024 * 1024;
    }

    try {
      $this->s3->upload(
        $this->s3_bucket(),
        $this->s3_path($file->filename()),
        fopen($file->filepath(), 'r'),
        'private',
        $options
      );
    }
    catch (S3Exception $e) {
      $e_msg = 'S3 error saving %file - %code %error';
      $e_args = array('%file' => $file->filename(), '%error' => $e->getMessage(), '%code' => $e->getAwsErrorCode());
      drupal_set_message(t($e_msg, $e_args), 'error');
      watchdog('backup_migrate_s3', $e_msg, $e_args, WATCHDOG_ERROR);
    }

    return $file;
  }

  /**
   * Load from the s3 destination.
   */
  function load_file($file_id) {
    backup_migrate_include('files');
    $file = new backup_file(array('filename' => $file_id));
    $this->s3_init();
    try {
      $this->s3->getObject(array(
        'Bucket' => $this->s3_bucket(),
        'Key'    => $this->s3_path($file->filename()),
        'SaveAs' => $file->filepath(),
      ));
    }
    catch (S3Exception $e) {
      $e_msg = 'S3 error loading %file - %code %error';
      $e_args = array('%file' => $file_id, '%error' => $e->getMessage(), '%code' => $e->getAwsErrorCode());
      drupal_set_message(t($e_msg, $e_args), 'error');
      watchdog('backup_migrate_s3', $e_msg, $e_args, WATCHDOG_ERROR);
    }

    return $file;
  }

  /**
   * List all files from the s3 destination.
   */
  function _list_files() {
    $this->s3_init();
    $files = array();
    backup_migrate_include('files');

    try {
      $o_iter = $this->s3->getIterator('ListObjects', array(
        'Bucket' => $this->s3_bucket(),
        'Prefix' => $this->settings['file_directory'] . '/',
      ));
      foreach ($o_iter as $o) {
        $info = array(
          'filename' => basename($o['Key']),
          'filesize' => $o['Size'],
          'filetime' => $o['LastModified'],
        );
        $files[$info['filename']] = new backup_file($info);
      }
    }
    catch (S3Exception $e) {
      $e_msg = 'S3 error listing files - %code %error';
      $e_args = array('%error' => $e->getMessage(), '%code' => $e->getAwsErrorCode());
      drupal_set_message(t($e_msg, $e_args), 'error');
      watchdog('backup_migrate_s3', $e_msg, $e_args, WATCHDOG_ERROR);
    }

    return $files;
  }

  /**
   * Delete from the s3 destination.
   */
  function _delete_file($file_id) {
    $this->s3_init();
    try {
      $this->s3->deleteObject(array(
        'Bucket' => $this->s3_bucket(),
        'Key'    => $this->s3_path($file_id),
      ));
    }
    catch (S3Exception $e) {
      drupal_set_message(t('There was an error deleting the remote file.'));
    }
  }

  /**
   * Validate the edit form for the item.
   */
  function edit_form_validate($form, &$form_state) {
    parent::edit_form_validate($form, $form_state);

    if (!class_exists('Aws\S3\S3Client')) {
      $requirements = backup_migrate_s3_requirements('runtime');
      $e_msg = t('Library not found or cannot be loaded.');
      if ($requirements['backup_migrate_s3']['severity'] == REQUIREMENT_OK) {
        $e_msg .= ' ' . t('You might have to <a href="!cache">clear cache</a> if you installed the library after enabling this module.', array('!cache' => url('admin/config/development/performance')));
      }
      else {
        $e_msg .= ' ' . $requirements['backup_migrate_s3']['description'];
      }
      form_set_error('', $e_msg);
    }
    else {
      // Do not attempt validation if there are errors in the form.
      if (form_get_errors()) return;

      try {
        $config = array(
          'base_url' => $form_state['values']['scheme'] . '://' . $form_state['values']['host'],
          'key'      => $form_state['values']['user'],
          'secret'   => empty($form_state['values']['pass']) ? $form_state['values']['old_password'] : $form_state['values']['pass'],
        );

        if (!empty($form_state['values']['s3_region'])) {
          $config['region'] = $form_state['values']['s3_region'];
          $config['signature'] = 'v4';
        }

        if (!empty($form_state['values']['s3_proxy'])) {
          $config['request.options'] = array(
            'proxy' => $form_state['values']['s3_proxy'],
          );
        }

        if (!empty($form_state['values']['s3_timeout'])) {
          $config['curl.options'] = array(
            CURLOPT_TIMEOUT => $form_state['values']['s3_timeout'],
          );
        }

        $this->s3 = S3Client::factory($config);

        $o_iter = $this->s3->getIterator('ListObjects', array(
          'Bucket' => $form_state['values']['path'],
          'Prefix' => $form_state['values']['file_directory'] . '/',
        ));
        foreach ($o_iter as $o) {
        }
      }
      catch (Exception $e) {
        $e_msg = 'There was an S3 error attempting to validate the settings below - %code %error';
        $e_args = array('%error' => $e->getMessage(), '%code' => $e->getCode());
        if (method_exists($e, 'getAwsErrorCode')) {
          $e_args['%code'] = $e->getAwsErrorCode();
        }
        form_set_error('', t($e_msg, $e_args));
      }
    }

  }

  /**
   * Submit the edit form for the item.
   */
  function edit_form_submit($form, &$form_state) {
    $this->settings['file_directory'] = $form_state['values']['file_directory'];
    $this->settings['s3_timeout'] = $form_state['values']['s3_timeout'];
    $this->settings['s3_proxy'] = $form_state['values']['s3_proxy'];
    $this->settings['s3_min_part_size'] = $form_state['values']['s3_min_part_size'];
    $this->settings['s3_region'] = $form_state['values']['s3_region'];
    $this->settings['s3_debug'] = $form_state['values']['s3_debug'];
    $this->settings['s3_retries'] = $form_state['values']['s3_retries'];

    parent::edit_form_submit($form, $form_state);
  }

  /**
   * Element validate callback for the file destination field.
   *
   * Remove slashes from the beginning and end of the destination value and ensure
   * that the file directory path is not included at the beginning of the value.
   *
   * @see _file_generic_settings_file_directory_validate().
   */
  function file_directory_validate($element, &$form_state) {
    // Strip slashes from the beginning and end of $widget['file_directory'].
    $value = trim($element['#value'], '\\/');
    form_set_value($element, $value, $form_state);
  }

  function numeric_validate($element, &$form_state) {
    // Strip slashes from the beginning and end of $widget['file_directory'].
    if ($element['#value'] && !is_numeric($element['#value'])) {
      form_error($element, t('@fieldname must be numeric.', array('@fieldname' => $element['#title'])));
    }
  }

  /**
   * Get the form for the settings for this filter.
   */
  function edit_form() {
    $form = parent::edit_form();
    $form['scheme']['#type'] = 'value';
    $form['scheme']['#value'] = 'https';

    $form['host']['#default_value'] = $form['host']['#default_value'] == 'localhost' ? '' : $form['host']['#default_value'];
    $form['host']['#description'] = t('Enter the S3 compatible host, i.e. s3.amazonaws.com, objects.dreamhost.com, etc.');

    $form['path']['#title'] = 'S3 Bucket';
    //$form['path']['#default_value'] = $this->get_bucket();
    $form['path']['#description'] = 'This bucket must already exist. It will not be created for you.';

    $form['user']['#title'] = 'Access Key ID';
    $form['pass']['#title'] = 'Secret Access Key';

    $form['pass']['#required'] = empty($this->dest_url);

    $form['file_directory'] = array(
      '#type' => 'textfield',
      '#title' => t('File directory'),
      '#default_value' => $this->settings['file_directory'],
      '#description' => t('Optional subdirectory within the s3 bucket where files will be stored. Do not include preceding or trailing slashes.'),
      '#element_validate' => array(array($this, 'file_directory_validate')),
      '#weight' => 25,
    );

    $form['s3_advanced'] = array(
      '#type' => 'fieldset',
      '#title' => t('Advanced'),
      '#collapsible' => TRUE,
      '#collapsed' => TRUE,
      '#weight' => 50,
    );
    
    $form['s3_region'] = array(
      '#type' => 'textfield',
      '#title' => t('S3 region'),
      '#default_value' => $this->settings['s3_region']?$this->settings['s3_region']:'',
      '#description' => t(
        'Optionally specify region which is needed for signature version authentication: !link1, !link2',
        array(
          '!link1' => l('S3 regions', 'https://docs.aws.amazon.com/de_de/general/latest/gr/rande.html#s3_region', array('absolute' => TRUE)),
          '!link2' => l('Read about the region setting', 'https://docs.aws.amazon.com/de_de/general/latest/gr/rande.html#s3_region', array('absolute' => TRUE)),
        )
      ),
      '#weight' => 30,
    );

    $form['s3_advanced']['s3_timeout'] = array(
      '#type' => 'textfield',
      '#title' => t('Timeout'),
      '#default_value' => $this->settings['s3_timeout'],
      '#description' => t('Optional timeout for S3 requests, useful if you are getting timeout exceptions. If empty, the internal default is used, which is normally PHP configuration <em>default_socket_timeout</em>, which is currently set to %value seconds.', array('%value' => ini_get('default_socket_timeout'))),
      '#element_validate' => array(array($this, 'numeric_validate')),
    );
    $form['s3_advanced']['s3_min_part_size'] = array(
      '#type' => 'textfield',
      '#title' => t('Minimum part size'),
      '#default_value' => $this->settings['s3_min_part_size'],
      '#description' => t('Minimum size to allow for each uploaded part when performing a multipart upload, in MegaBytes.'),
      '#element_validate' => array(array($this, 'numeric_validate')),
    );
    $form['s3_advanced']['s3_proxy'] = array(
      '#type' => 'textfield',
      '#title' => t('Proxy'),
      '#default_value' => $this->settings['s3_proxy'],
      '#description' => t('Optional proxy to pass through to the S3 Client library. It should be specified in the <em><b>HOST:PORT</b></em> format.'),
    );

    return $form;
  }

}

Also, could you please update ca-certificates on this server?
Because:

According to the docs (Class Aws\S3\S3Client | AWS SDK for PHP 3.x), you should be able to set use_path_style_endpoint => true to the config of S3Client.
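A minimal sketch of what that could look like, assuming AWS SDK for PHP v3 (the pasted module uses the older `S3Client::factory()` style, and the credential values below are placeholders):

```php
<?php
// Sketch only (AWS SDK for PHP v3 config): force path-style addressing so
// requests go to gateway.storjshare.io/BUCKET instead of the virtual-hosted
// style BUCKET.gateway.storjshare.io. Key/secret are placeholders.
$config = [
    'version'                 => 'latest',
    'region'                  => 'us-east-1',   // required by the SDK; the gateway does not use it
    'endpoint'                => 'https://gateway.storjshare.io',
    'use_path_style_endpoint' => true,
    'credentials'             => [
        'key'    => 'YOUR_ACCESS_KEY',
        'secret' => 'YOUR_SECRET_KEY',
    ],
];
// $s3 = new Aws\S3\S3Client($config);  // uncomment once the SDK is installed
```

With path-style addressing the bucket never becomes part of the TLS hostname, which also sidesteps certificate issues on bucket subdomains.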

Sorry, I am not sure what you mean by “update ca-certificates on this server”. Could you point me in the right direction on what I need to do? Provide a link to a wiki or something on it?

For Debian-based systems:

sudo apt update
sudo apt install ca-certificates

This should update root certificates on your OS.

This would be on the server where our website is hosted. I will have to ask our host to do this. I will get back to you

1 Like

I am not getting anywhere on updating the certificates. Our hosting provider pointed out that our certificates are not expired. According to the error message, the expired certificate is at https://secureribackups.gateway.storjshare.io.

They told me I need to get more input from you on what is needed.
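One way to narrow this down (the hostname below is taken from the error message earlier in the thread): openssl can show the certificate the gateway actually presents. If the dates look valid and the same check passes on another machine but the backup still fails on the web server, the web server's local root CA bundle is outdated, not the remote certificate.

```shell
# Print the validity dates and issuer of the certificate presented for the
# virtual-hosted bucket hostname from the error message.
openssl s_client -connect gateway.storjshare.io:443 \
    -servername secureribackups.gateway.storjshare.io </dev/null 2>/dev/null \
  | openssl x509 -noout -dates -issuer
```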