DTW Final Ancient Port and Shipwreck Map

 

 

This project was created by Will Richards, Tom Choi, and David Coleman. We are all seniors at Carleton, and we’ve all enjoyed this class. There are two primary elements to the project: a map of the discovered shipwrecks from AD 1-1500, and a series of historical summaries of nine ancient port cities. Also included are a brief analysis of the dataset and the trends within it, along with a description of the processes we used to create this project. Enjoy!

David, Tom, and Will

Post 8 – DTW Final Project Update

Excerpt from our timeline of deliverables:

  • By the end of Week 6: Have our data cleaned and uploaded to a MySQL database
  • By the end of Week 7: Have the data connected to the map, with the interface existing, if not polished

 

Progress: What have you done so far, who have you talked to, what have you gathered, and what have you built?

Our first order of business was to clean our dataset. We determined which variables and information we wanted to display and store, and which we would throw out. We then designed a relational database to store our dataset with minimal redundancy. From there we connected to our MySQL database and created an XML file containing our data in an appropriate structure. Next, we embedded a Google Maps window into the main page of our website. Lastly, we created ‘markers’ for our map: data structures that currently contain the geolocation and comments for each ship, and that are displayed on the map.
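As a rough illustration of how the MySQL-to-XML step fits together (a minimal sketch, not our actual script; the credentials and the table and column names below are placeholders), the generator could look something like this:

    <?php
    // Minimal sketch of the MySQL-to-XML step. The credentials and the
    // table/column names (shipwrecks, latitude, longitude, comments) are placeholders.
    $db = new mysqli('localhost', 'dtw_user', 'password', 'dtw_shipwrecks');
    if ($db->connect_error) {
        die('Connection failed: ' . $db->connect_error);
    }

    $doc  = new DOMDocument('1.0', 'UTF-8');
    $root = $doc->createElement('markers');
    $doc->appendChild($root);

    $result = $db->query('SELECT id, latitude, longitude, comments FROM shipwrecks');
    while ($row = $result->fetch_assoc()) {
        $marker = $doc->createElement('marker');
        $marker->setAttribute('id', $row['id']);
        $marker->setAttribute('lat', $row['latitude']);
        $marker->setAttribute('lng', $row['longitude']);
        $marker->setAttribute('comments', $row['comments']);
        $root->appendChild($marker);
    }

    // The map page loads this file and builds one Google Maps marker per <marker> element.
    $doc->save('shipwrecks.xml');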

 

Problems (and proposed solutions): What issues have you run into?

      • Issues loading unsupported characters into our XML file (fixed by using utf8_encode(); see the sketch after this list)
      • Syntax issues while generating our Google Maps markers from our XML, but we forced our way through it
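For context on the utf8_encode() fix, here is a minimal, self-contained sketch (with made-up example data, not our real records): a string that comes out of MySQL as Latin-1 bytes is re-encoded as UTF-8 before it is written into the XML, which keeps DOMDocument from rejecting the character.

    <?php
    // Sketch of the utf8_encode() fix with an invented example string.
    // 0xEE is "î" in ISO-8859-1; as a raw byte it is not valid UTF-8.
    $raw   = "Amphorae from N\xEEmes";
    $clean = utf8_encode($raw);           // re-encoded as valid UTF-8: "Amphorae from Nîmes"

    $doc    = new DOMDocument('1.0', 'UTF-8');
    $marker = $doc->createElement('marker');
    $marker->setAttribute('comments', $clean);  // the raw Latin-1 bytes would not survive this step cleanly
    $doc->appendChild($marker);
    echo $doc->saveXML();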

Have they forced you to change your initial plan?

Our initial plan is still on track.  We have plenty of time to explore exactly how we want to present our data and site, so none of our plans have changed.

Do you have a proposed solution or do you need help formulating one?

n/a

 

Tools and techniques: What applications/languages/frameworks have you selected and how are you going to implement them?

      • MySQL relational database to store our data (implemented)
      • Generate an XML file from the MySQL database with a PHP script (implemented)
      • Even with our UTF-8 encoding, some characters don’t appear on Google Maps the way we would like them to. Perhaps we could do more thorough data cleaning and replace troublesome characters in a Python script before they are ever loaded into the XML file (a rough sketch follows this list). If we have problems, we will come to you.
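If we do go that route, the replacement pass might look something like the sketch below. The last bullet proposes doing it in Python; the same idea is shown here in PHP to match the scripts we already have, and it assumes the field is already valid UTF-8.

    <?php
    // Sketch of a "replace troublesome characters" pass (illustrative only).
    // //TRANSLIT asks iconv to approximate accented characters with plain ASCII
    // where it can; //IGNORE drops anything it cannot map at all.
    function clean_field($value) {
        $ascii = iconv('UTF-8', 'ASCII//TRANSLIT//IGNORE', $value);
        return trim(preg_replace('/\s+/', ' ', $ascii));  // collapse stray whitespace
    }

    echo clean_field("Amphorae from Nîmes");  // typically prints: Amphorae from Nimes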

 

Deliverables: An updated timeline of deliverables

  • By the end of Week 6: Have our data cleaned and uploaded to a MySQL database
  • By the end of Week 7: Have the data connected to the map, with the interface existing, if not polished
  • By the end of Week 8: Have performed analysis on our data and begun to incorporate that analysis into our web app in the form of graphics and statistics
  • By the end of Week 9: Have finished both researching and incorporating the featured shipwrecks
  • By the end of Week 10: Have the entire project complete and live

(no change)

Is your project still on track?

Yep!

Post 5 – Group and Project

Our group – Will Richards, Tom Choi, and David Coleman – would like to create an interactive map of the world’s recorded shipwrecks from AD 1-1500. We hope to include information such as the date the ship was wrecked, the date it was discovered, the location of each shipwreck, and the contents of each ship. These parameters might be limited by the robustness of our data, but we will discover soon if that is the case. Our data come from the Digital Atlas of Roman and Medieval Civilizations Scholarly Data Series in the form of the Summary Geodatabase of Shipwrecks AD 1-1500, current as of 2008. There are ~1000 observations in the database, each with most of the variables listed above. Certain variables look more reliably reported than others, but after some data restrictions and cleaning, each observation will be fit for analysis.

That analysis will include details of how cargo changed over time, which geographic regions saw greater-than-average numbers of shipwrecks, and which periods in history have greater numbers of recorded wrecks. It is important to note that a crucial limitation of using recorded historical data like this is that we have neither a random sample nor the whole population of shipwrecks. We are working with the shipwrecks whose records and locations were documented well enough to be registered in this database, and there are very likely clear biases in the data as a result. There will almost certainly be a bias toward ships from civilizations with better record-keeping (such as the Roman Empire). This bias does not invalidate the inferences we will make with our data; it merely restricts the scope of those inferences. Any claims we make will then be about ships similar to those in our dataset, including whatever trends we may end up finding there.

Before we do anything with our data, we first need to clean it so that we can better categorize it by its cargo. The cargo column cells contain many repeated single words (such as amphoras, silver, swords, and ceramic). Other fields have long text values describing the cargo in prose. We will decide on some number (to be determined) of discrete categories to group our cargo into, and classify each text value as one of those enumerated categories. In this way, we will be able to store our data as a relational database.
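As a sketch of what that classification could look like (the categories and keywords here are purely illustrative, not our final scheme), a small keyword-to-category map scanned against the free-text cargo field is enough to assign each record an enumerated category:

    <?php
    // Illustrative cargo categorization: the first keyword found in the free-text
    // cargo description decides the enumerated category stored in the database.
    function categorize_cargo($cargoText) {
        $keywords = array(
            'amphora' => 'amphorae',
            'wine'    => 'foodstuffs',
            'olive'   => 'foodstuffs',
            'silver'  => 'precious metals',
            'gold'    => 'precious metals',
            'sword'   => 'weapons',
            'ceramic' => 'ceramics',
            'marble'  => 'building materials',
        );
        $text = mb_strtolower($cargoText);
        foreach ($keywords as $needle => $category) {
            if (strpos($text, $needle) !== false) {
                return $category;
            }
        }
        return 'other / unknown';
    }

    echo categorize_cargo('Cargo of amphoras, probably carrying wine');  // prints: amphorae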

Once our data is cleaned and categorized, we can decide which aspects of the data we want to be able to filter or sort by. We can then begin integrating our web map with the MySQL database. Once the web map is complete, we can begin building other aspects of our website, such as a ‘featured wrecks’ page, an ‘about us’ page, etc.
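To show the shape of that integration (only a sketch; the filters themselves are still undecided, and the column names cargo_category and wreck_year are assumptions), the XML generator could take the chosen filters as query parameters and narrow its query with a prepared statement:

    <?php
    // Sketch of a filtered version of the XML generator. Column names and the
    // specific filters are placeholders; everything else follows the unfiltered
    // script sketched earlier.
    $db = new mysqli('localhost', 'dtw_user', 'password', 'dtw_shipwrecks');

    $category  = isset($_GET['category']) ? $_GET['category'] : 'amphorae';
    $startYear = isset($_GET['start'])    ? (int) $_GET['start'] : 1;
    $endYear   = isset($_GET['end'])      ? (int) $_GET['end']   : 1500;

    $stmt = $db->prepare(
        'SELECT id, latitude, longitude, comments
           FROM shipwrecks
          WHERE cargo_category = ? AND wreck_year BETWEEN ? AND ?'
    );
    $stmt->bind_param('sii', $category, $startYear, $endYear);
    $stmt->execute();
    $result = $stmt->get_result();

    while ($row = $result->fetch_assoc()) {
        // ...build the <marker> elements from each row, as in the unfiltered sketch
    }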

 

As far as a timeline of deliverables, our plan is:

  • By the end of Week 6: Have our data cleaned and uploaded to a MySQL database
  • By the end of Week 7: Have the data connected to the map, with the interface existing, if not polished
  • By the end of Week 8: Have performed analysis on our data and begun to incorporate that analysis into our web app in the form of graphics and statistics
  • By the end of Week 9: Have finished both researching and incorporating the featured shipwrecks
  • By the end of Week 10: Have the entire project complete and live

 

This is the link to the project.

This interactive ‘ikiMap’ gives a good idea of the project we are planning to attempt. It is an interactive map of the sunken ships of the Great Lakes of North America.

Henceforth, our group tag will be “DTW”
