QuickTip: Integrating Amazon S3 in your Scala Product


This post is a quick cheat sheet for integrating your Scala product with Amazon S3. The prerequisites are a valid S3 account and the access keys for it.

We keep our S3 credentials in application.conf and use Typesafe Config to manage our configuration. The example entries look like this:

# Amazon S3 Details
aws.s3.bucket=email-attachments
aws.s3.accesskey="{{your-access-key}}"
aws.s3.secretkey="{{your-secret-key}}"
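The client code below reads these entries through a `fetchProperty` helper that this post does not show. A minimal sketch of such a helper on top of Typesafe Config could look like this (the object name and signature are assumptions, chosen to match the calls below):

```scala
import com.typesafe.config.{Config, ConfigFactory}

object PropertyReader {
  // Loads application.conf (plus reference.conf overrides) from the classpath once.
  private val config: Config = ConfigFactory.load()

  // Hypothetical helper; getString throws if the key is missing, which
  // surfaces misconfiguration at startup rather than at the first S3 call.
  def fetchProperty(propertyName: String): String =
    config.getString(propertyName)
}
```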

Now we need the AWS SDK dependency in the build.sbt file:

libraryDependencies += "com.amazonaws" % "aws-java-sdk" % "1.3.11"

And then the code is pretty straightforward. We create the AWS client:

import java.io.ByteArrayInputStream
import org.slf4j.LoggerFactory
import com.amazonaws.ClientConfiguration
import com.amazonaws.auth.BasicAWSCredentials
import com.amazonaws.services.s3.AmazonS3Client
import com.amazonaws.services.s3.model.ObjectMetadata

object AmazonS3Communicator {
  val logger = LoggerFactory.getLogger(this.getClass.getName)
  val credentials = new BasicAWSCredentials(fetchProperty(propertyName = "aws.s3.accesskey"), fetchProperty(propertyName = "aws.s3.secretkey"))
  // Allow slow transfers up to five minutes, and pass the configuration
  // to the client so the timeout actually takes effect
  val clientConfiguration = new ClientConfiguration()
  clientConfiguration.setSocketTimeout(300000)
  val amazonS3Client = new AmazonS3Client(credentials, clientConfiguration)
and then use it for various operations on S3. Let us look at upload, delete, and checking whether a file exists.

/**
   * Uploads a file to the standard bucket on S3
   */
  def upload(meta: ObjectMetadata, stream: ByteArrayInputStream, filename: String): Boolean = {
    try {
      amazonS3Client.putObject(fetchProperty(propertyName = "aws.s3.bucket"), filename, stream, meta); true
    } catch {
      case ex: Exception => logger.error(ex.getMessage(), ex); false
    }
  }
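A caller would typically build the `ObjectMetadata` and the stream from a byte array. A sketch of such a call (the content, content type, and file name here are made up for illustration):

```scala
import java.io.ByteArrayInputStream
import com.amazonaws.services.s3.model.ObjectMetadata

// In a real application the bytes would come from an email attachment or upload.
val attachment: Array[Byte] = "report contents".getBytes("UTF-8")

val meta = new ObjectMetadata()
meta.setContentLength(attachment.length) // known up front, so the SDK can stream
meta.setContentType("text/plain")

val uploaded: Boolean =
  AmazonS3Communicator.upload(meta, new ByteArrayInputStream(attachment), "reports/report.txt")
```

Setting the content length up front matters: without it, the SDK has to buffer the whole stream in memory to compute the length before sending.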

/**
   * Deletes a file from the standard bucket on S3
   */
  def delete(fileKeyName: String): Boolean = {
    try {
      amazonS3Client.deleteObject(fetchProperty(propertyName = "aws.s3.bucket"), fileKeyName); true
    } catch {
      case ex: Exception => logger.error(ex.getMessage(), ex); false
    }
  }
/**
   * Checks if the file exists on the standard bucket of S3
   */
  def doesFileExist(fileKeyName: String): Boolean = {
    try {
      amazonS3Client.getObjectMetadata(fetchProperty(propertyName = "aws.s3.bucket"), fileKeyName); true
    } catch {
      case ex: Exception => logger.error(ex.getMessage(), ex); false
    }
  }
}
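These helpers compose naturally. For example, a cleanup job might check for a file before removing it (the key below is hypothetical). Note that `deleteObject` is idempotent on S3, so the existence check is optional and mainly useful for logging:

```scala
val key = "old-report.txt"
if (AmazonS3Communicator.doesFileExist(key)) {
  if (AmazonS3Communicator.delete(key))
    AmazonS3Communicator.logger.info(s"Deleted $key from S3")
}
```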

About Vikas Hazrati

Vikas is the Founding Partner @ Knoldus which is a group of software industry veterans who have joined hands to add value to the art of software development. Knoldus does niche Reactive and Big Data product development on Scala, Spark and Functional Java. Knoldus has a strong focus on software craftsmanship which ensures high-quality software development. It partners with the best in the industry like Lightbend (Scala Ecosystem), Databricks (Spark Ecosystem), Confluent (Kafka) and Datastax (Cassandra). To know more, send a mail to hello@knoldus.com or visit www.knoldus.com
This entry was posted in Amazon EC2, Scala.
