Tag: aws

Accessing AWS Redshift with Python Pandas via the psycopg2 driver

Melbourne Tweet Cloud


[Screenshot: Melbourne tweet word cloud]

URL: words.yznotes.com

If a picture is worth a thousand words, then the picture on the left should be worth at least 1,100 words :)

This app collects tweets posted within 10 km of Melbourne's CBD, does some natural language processing, and renders the top 100 words into a word cloud.
It is refreshed every 10 minutes, 24/7, and runs on Amazon Web Services.

This is a work in progress; new features will be added over time.

Postgres Upgrade from 9.1 to 9.3 and post-upgrade configs (draft notes)

Environment: Postgres 9.1 running on Ubuntu 13.04 on an AWS EC2 instance, with a single cluster stored on an AWS EBS volume

Some pre-steps can be found here:
https://wiki.postgresql.org/wiki/Using_pg_upgrade_on_Ubuntu/Debian

Get details of the present clusters:
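The actual snippet is not preserved in these notes; on a Debian/Ubuntu install this is done with pg_lsclusters:

    # lists version, cluster name, port, status, owner and data directory for each cluster
    pg_lsclusters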

Stop all Postgres services:
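The command itself is missing here; on Ubuntu it would typically be the service wrapper:

    sudo service postgresql stop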

Drop the new 9.3 cluster created by default during the installation of Postgres 9.3:
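Presumably with pg_dropcluster; the cluster name "main" is assumed here, being the Ubuntu default:

    # drop the empty 9.3 cluster so the upgrade can reuse the name
    sudo pg_dropcluster 9.3 main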

Create a new 9.3 cluster from the existing 9.1 one:
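The upgrade itself is normally done with pg_upgradecluster, which creates a 9.3 cluster from the 9.1 one and migrates the data (again assuming the default cluster name "main"):

    sudo pg_upgradecluster 9.1 main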

Check if it worked:
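A reasonable check, mirroring the first step, is to list the clusters again and confirm that 9.3/main is online:

    pg_lsclusters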

Make sure you can access your data; if everything is OK, drop the 9.1 cluster:
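The drop command is not shown; it would normally be:

    sudo pg_dropcluster 9.1 main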


All steps below are optional: Move data storage to an alternative location
In my case it is an Amazon EBS (Elastic Block Store) volume mounted at /data/main

Stop all Postgres services:
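As in the upgrade steps above:

    sudo service postgresql stop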

Move the actual data directory:
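The exact commands are missing; one way to do it, assuming the EBS volume is already mounted at /data/main, is to copy the contents of the data directory across and fix ownership and permissions (rsync here is an assumption, not necessarily the tool used originally):

    sudo rsync -av /var/lib/postgresql/9.3/main/ /data/main/
    sudo chown -R postgres:postgres /data/main
    sudo chmod 700 /data/main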

Amend 9.3 configs:
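On a stock Ubuntu install the 9.3 configuration lives under /etc/postgresql/9.3/main/; any editor will do:

    sudo nano /etc/postgresql/9.3/main/postgresql.conf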

Comment out the old data_directory path and point it to the new location:
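In postgresql.conf this would look roughly like the following, given the /data/main mount point mentioned above:

    #data_directory = '/var/lib/postgresql/9.3/main'    # old default location
    data_directory = '/data/main'                       # new location on the EBS volume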

Check if it worked:
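One way to verify the change before the server is even started is pg_lsclusters, which reads the configuration and shows the data directory of each cluster:

    pg_lsclusters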

Start postgres:
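On Ubuntu the service wrapper does it; once the server is up, SHOW data_directory is a handy confirmation that the new path is in use:

    sudo service postgresql start
    sudo -u postgres psql -c "SHOW data_directory;"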

Optional – remove /var/lib/postgresql/9.3/main created by default during 9.3 installation:
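Double-check that data_directory already points at /data/main before doing this, since the command below is destructive:

    sudo rm -rf /var/lib/postgresql/9.3/main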

Interactive Data Analysis Setup on AWS



Current setup includes: Ubuntu Linux Server on EC2 (Elastic Compute Cloud) with Postgres and Python data analysis tools (IPython, NumPy, Pandas, etc.) + Elastic IP + Load Balancer + EBS (Elastic Block Store)
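The provisioning commands are not part of this post; a minimal sketch of the software side on a stock Ubuntu server might look like the following (package names are assumptions based on the tools listed above):

    sudo apt-get update
    sudo apt-get install -y postgresql python-pip python-dev libpq-dev
    sudo pip install ipython numpy pandas psycopg2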