Just to recap, our project entailed taking Zoobook data and converting it to text. The text was then used to map the data points in ArcGIS. This plan was easier said than done, as we encountered a host of problems and roadblocks along the way. Read all about it here.
3D Modeling and the Appropriateness of Method to a Project
Over the past week, we have delved deeper into the world of 3D modeling. We have explored programs such as CityEngine, with its procedural generation, and we have toyed with photogrammetry programs such as Photoscan. After experiencing these two new modeling options and recalling experiences with simpler programs such as SketchUp, a few quick questions come to mind. What are the pros and cons of each of these options? What is the best use of each approach?
Procedural modeling is best used for the creation of cities and towns in which high detail is not required. All one needs is to write a starting rule set with instructions on how to model each of the points provided. Once the program has been created (which may take some time), all one needs to do is enter the data, and the software does the work.
Pros: It is well suited to modeling a large number of objects with a relatively similar appearance.
Cons: Lack of detail. All the buildings come out looking the same and relatively flat in appearance, a uniformity caused by the modeling engine. It is bad at fine detail and historical accuracy.
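The idea above can be sketched in a few lines of Python. This is a toy illustration, not CityEngine's actual CGA rule syntax: one hypothetical rule function is applied to every data point, which is exactly why the results come out fast but uniform.

```python
import random

def building_rule(lot, seed=0):
    """Toy procedural 'rule': turn a lot center point into a simple box building.

    Every lot gets the same fixed footprint and only a small, rule-driven
    variation in height -- the uniformity discussed above. The parameters
    here are hypothetical, not CityEngine's real CGA attributes.
    """
    rng = random.Random(seed + hash(lot))   # deterministic per-lot variation
    width, depth = 10, 10                   # same footprint for every lot
    height = rng.choice([6, 9, 12])         # pick one of a few storey heights
    return {"lot": lot, "width": width, "depth": depth, "height": height}

# Enter the data points; the "software" does the rest of the work.
lots = [(0, 0), (20, 0), (40, 0)]
city = [building_rule(lot) for lot in lots]
```

One short rule generates an arbitrarily large city, but every building it emits shares the same flat box shape.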
Photogrammetry pros: Easy to create. This process produces a highly realistic and authentic replica of the original item. It is also easy and fast to use, making it fantastic for museum collections and other objects with easy access.
Cons: It is only usable if you have unobstructed access to the object. It does not work for items that no longer exist or that you wish to reproduce in a different state than they are in now.
Manual modeling (e.g., SketchUp) pros: Authentic replication of an object, and the original object does not need to still exist.
Cons: Time-consuming. It can take days to create even one project in high detail.
Examination of an Existing Project: Marie Saldana
Marie Saldana’s Rome project is a fantastic example of procedural modeling from computer code. Through this project, the Roman city can be viewed and appreciated as a whole, and the sense of scale it expresses is impressive. Where the project falls short, however, is in its detail: most of the buildings feel flat and without variation. This flatness is to be expected, as everything was modeled by a computer using the same code for each building.
Final Project Proposal
Our group, consisting of Chris J, Chris L, and Lawrence Lin, is going to develop an interactive timeline map of Carleton student statistics, with a primary focus on where students have come from throughout the college's history, along with other corresponding statistics such as gender, major, religious affiliation, and athletic information.
We hope to find the data we're looking for in the Carleton Archives, at Carleton Admissions, and with the help of the Office of the Deans. We are assuming they will at least have the hometown and major of each admitted student, as well as the other information we are seeking. We will obviously be dealing with massive amounts of data, and we hope to use a relational database to store and manipulate it. We're hoping that Carleton has digitized its archives, but if not, we will be looking at a lot of documents and will need a scanner to convert them into PDFs and a PDF reader to extract the data we need from them. Once we have a relational database set up, we want to use the ArcGIS platform to create an interactive timeline map that offers the user information about each Carleton student and the geographical history of admitted students.
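To make the relational-database idea concrete, here is a minimal sketch in Python using the standard-library sqlite3 module. The table, column names, and sample rows are all placeholders we invented for illustration, not the Archives' actual fields or real student records.

```python
import sqlite3

# In-memory database for the sketch; a real project would use a file
# (or a server-based database) instead of ":memory:".
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE students (
        id INTEGER PRIMARY KEY,
        year_admitted INTEGER,
        hometown TEXT,
        state TEXT,
        major TEXT,
        gender TEXT
    )
""")

# Hypothetical sample rows standing in for the scanned archival data.
rows = [
    (1995, "Northfield", "MN", "History", "F"),
    (1995, "Chicago", "IL", "Biology", "M"),
    (2005, "Seattle", "WA", "Computer Science", "F"),
]
conn.executemany(
    "INSERT INTO students (year_admitted, hometown, state, major, gender)"
    " VALUES (?, ?, ?, ?, ?)",
    rows,
)

# Aggregate admits by state for a single year -- the kind of query that
# could drive one frame of the interactive timeline map.
by_state = conn.execute(
    "SELECT state, COUNT(*) FROM students"
    " WHERE year_admitted = 1995 GROUP BY state"
).fetchall()
```

Queries like the last one could be exported per year and joined to geographic coordinates for display in ArcGIS.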
We're going to try to break our project down into three weeks. By the end of week 6 we want to have access to, and to have digitized, all the data we need for this project, including having scanned any documents we plan on using. By the end of week 7 we want to have our relational database set up and able to store our massive amount of data. By the end of week 8 we want to have a rough cut of the final project, leaving the rest of week 9 to clean up the site and figure out our bibliography.