Most Recent Articles
The Story of Us (I = Information)
Most of what we learn about anything comes from outsiders and outside sources. Historians pore over archival primary sources and abandoned documents to examine how the dead truly lived. Memoirists and screenwriters, on the other hand, conjure up scenes from childhood to shape personal narratives. In both cases, information and archives are the keys to storytelling. Whether the goal is factual or persuasive, resources can be used and interpreted to serve many needs.
Archives and Astrology
Archives are like astrology. Astrology is an interpretation.
It is layered onto a myth and then onto a logical system that is fully realized but completely distinct from the real world. Similarly, an archive is an interpretation layered onto an information architecture and a content management system. Whether or not it has a fully realized taxonomy built from internal preferred terms, the system protects content and context.
A metadata schema standardizes the data. A trusted archival system provides a finding aid to assets that may be completely distinct from, yet integrated within, the real world.
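As a minimal sketch of what a schema standardizes, here is a descriptive record using Dublin Core-style element names (the field values and identifier path are hypothetical, invented for illustration):

```python
# A minimal descriptive metadata record using Dublin Core-style
# element names. All values are hypothetical examples.
record = {
    "title": "Board meeting minutes, March 1987",
    "creator": "Office of the Secretary",
    "date": "1987-03-12",
    "format": "application/pdf",
    "identifier": "org-archives/board/1987/minutes-03.pdf",
}

# A schema standardizes data: every record must supply the same fields.
REQUIRED_FIELDS = {"title", "creator", "date", "format", "identifier"}

def validate(rec):
    """Return the set of required fields missing from a record."""
    return REQUIRED_FIELDS - rec.keys()

print(validate(record))  # an empty set means the record conforms
```

Because every record carries the same fields, a finding aid can be generated mechanically from the records themselves.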
Calculating I = Information
Information has many meanings as well. Some see it as an age and a commodity. Others, like James Gleick, are more precise: in his excellent and riveting book The Information, he argued that it is a history, a theory, a flood. Paired with Claude Shannon's thesis, it's perfect. Today, information is everything. It is also a cloud: in the age of cloud computing, information surrounds us both virtually and in reality.
Information, however, is also an interpretation. Based on a theory by Börje Langefors, the infological equation states that information is a function of an interpretation of data and pre-knowledge over time.
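In symbols, the infological equation is commonly written as:

```latex
I = i(D, S, t)
```

where \(I\) is the resulting information, \(i\) is the interpretation process, \(D\) is the data, \(S\) is the recipient's pre-knowledge, and \(t\) is the time available for the interpretation.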
I love this equation. It clearly shows how Digital Archivy archivists can help improve your information. By focusing on the data and on knowledge of provenance (creators), functions, and objects, and then by establishing or defining the time frame, an institution will gain accurate and valuable information.
Based on knowledge of international standards and familiarity with functional requirements and best practices, it is possible to interpret resources more effectively in order to identify, protect, and amplify the information.
Applying Information Strategy
Developing a methodology or a tool to help with interpreting data is key. A strategy built upon three main resources will solve this problem:
- clearly identifying and defining data sets
- determining who owns and can share relevant pre-knowledge (provenance)
- tracking time restraints
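The three resources above can be sketched as a simple record; this is illustrative only, not a real schema, and the field names and sample values are invented:

```python
from dataclasses import dataclass

@dataclass
class InterpretationContext:
    """One row of the strategy checklist described above (illustrative)."""
    dataset: str           # a clearly identified and defined data set
    provenance_owner: str  # who owns and can share relevant pre-knowledge
    time_frame: str        # the time restraints being tracked

# A hypothetical example entry.
ctx = InterpretationContext(
    dataset="press-photos-2003",
    provenance_owner="Communications Dept.",
    time_frame="2003-2008",
)
print(ctx)
```

Keeping these three facts together per data set makes the later interpretation step auditable: anyone can see which data, whose pre-knowledge, and which time frame produced a given conclusion.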
There is one additional key element of this equation's system: clearly identifying the algorithm used for interpretation. This may seem like a minor, esoteric point, but in reality it is the most important variable of all. Reasoned conclusions can be drawn only if the data and pre-knowledge can be trusted. Users can take control by sorting search results by date, author, or title. The results may be the same, but the ordering and display of the data may affect which content is available.
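A tiny sketch makes the point: the records below are hypothetical, but the same result set surfaces a different item first depending on the sort key chosen.

```python
# Hypothetical search results: one result set, three orderings.
results = [
    {"title": "Annual Report", "author": "Zane",  "date": "2015-06-01"},
    {"title": "Budget Memo",   "author": "Chen",  "date": "2014-02-10"},
    {"title": "Charter",       "author": "Baker", "date": "2016-09-30"},
]

by_date   = sorted(results, key=lambda r: r["date"])
by_author = sorted(results, key=lambda r: r["author"])
by_title  = sorted(results, key=lambda r: r["title"])

# Same records, but a different item appears first on screen.
print(by_date[0]["title"])    # Budget Memo
print(by_author[0]["title"])  # Charter
print(by_title[0]["title"])   # Annual Report
```

The interpretation algorithm, here just a sort key, did not change the data at all, yet it changed which content a user encounters first.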
Information is a flood, a cloud, and much more. Data surrounds us at all times. Once you successfully identify the data sets and the other elements, the component parts will work together to present the accurate information you need. If you need help managing or interpreting your information, please see our clients page.
The Amazing and Incredible History and Future of Brooklyn Animation
One year ago, I hosted and curated an event at the Brooklyn Historical Society titled “The Amazing and Incredible History and Future of Brooklyn Animation.” Surprisingly, it sold out!
Coincidentally, the event was held on the 150th birthday of the father of American animation, Brooklyn animator Winsor McCay. The evening was both educational and inspiring.
It began with a live performance of McCay’s most famous animated cartoon, Gertie the Dinosaur (1914), which he produced in Sheepshead Bay. As a 150th birthday present, John Canemaker, an Emmy and Oscar award-winning independent animator who is also an author, professor, and animation historian at NYU, performed the interactive parts and read the inter-titles live. It was truly incredible. As part of the evening’s festivities, he also presented a PowerPoint based on his book Winsor McCay: His Life and Art.
Making History
Film collector and curator Tommy Stathes also provided information and insight on some of the early Brooklyn animators. Throughout the evening, he projected clips focused on innovation and invention in Brooklyn animation—primarily from Max Fleischer and Fleischer Studios. Mr. Stathes showed clips made using the rotoscope machines patented by Max Fleischer in Brooklyn in 1917, and he also projected Fleischer Studios films featuring Koko the Clown, Betty Boop, and Popeye. There were some eye-popping and risqué examples of animation at its best. As a special treat, he screened some of the earliest Fleischer Studios sound cartoons, created in NYC prior to The Jazz Singer.
As curator, archivist David Kay also moderated and hosted the Q&A. His first-hand research and interviews helped uncover the pivotal role that Brooklyn played in creating the animation industry. With assistance from Max Fleischer’s granddaughter, he projected the image of the rotoscope patent that Max Fleischer filed in 1917. That patent changed the path of American animation. Years later it even changed video games!
This led to the creation of one of the largest and most productive animation studios in the nation—many years before Walt Disney created Mickey Mouse! Brooklyn’s animated stars include Koko the Clown, Betty Boop, Popeye, and even Superman. The evening showed that both the roots and the future of American animation are found in Brooklyn.
The Future
Fittingly, the highlight of the event was the appearance of Brooklyn-based animator Jennifer Oxley, a multiple-Emmy-Award-winning producer of many children’s TV shows and a true visionary. She helped invent the “photo-puppetry” style created for The Wonder Pets! TV show on Nick Jr. At this event, Ms. Oxley showed new styles and examples, and she spoke about her career and her dream of becoming an animator. She also shared new clips from Peg + Cat, the Emmy Award-winning show she produces for PBS Kids, and previewed a clip from her new animated series based on an imagined friendship between Amelia Earhart and Josephine Baker.
Afterwards, panelists participated in a brief question and answer period. Reviews were favorable and, at the end of the event, Ms. Oxley answered additional questions and signed books for a very large audience of shy, young aspiring animators.
We commissioned a poster from artist Tommy Yesterday to honor the event. And the writer, cartoon producer and author Jerry Beck even wrote a brief article about the evening.
If you would like more information on the next series, contact dkay@digitalarchivy.com.
Cloud Computing
The ongoing enthusiasm for cloud computing and platform-as-a-service storage solutions continues to increase. For years, IT professionals repeated their mantra that “storage is cheap [so let’s] save it all!” From IT’s perspective, appraising data is superfluous when the strategy is to save everything. Simply put, cloud computing has become pervasive because it provides easy flexibility and scalability. It offers cost-effective solutions and a number of other indisputable short-term gains. These include:
- increased efficiency
- easy scaling
- elimination of high hardware costs
- automated backup and disaster-recovery plans
- easier employee collaboration on documents
- cost-effective, pay-as-you-go subscription models
Undoubtedly, these are significant and valuable benefits. But creating a sensible digital archiving plan has its own challenges and costs. Cloud computing may seem like the only answer, but it may not be the best one. Benign neglect and automated uploads or backups may cause more widespread, senseless anxiety; they may even encourage digital hoarding. It may, in fact, be time to step away from the pack and question the wisdom of transferring your assets to the cloud.
Cloud Computing Usage, Storage and Energy Consumption
Data use and storage require electricity that comes primarily from burning fossil fuels. Coal- and oil-based electricity powers computers, enables data transmission across and between networks, and supplies the energy to store and process an always-growing mountain of data. As the server farms that power cloud computing grow, their demand for electricity increases unceasingly. According to McKinsey & Company, data centers will produce more carbon emissions than the entire airline industry by 2020.
Given what we know about digital archivy, it is shocking that no one (neither stakeholders nor IT) makes appraisal decisions before migrating to the cloud. Without understanding how record creators and end-users interact and relate through their records, it is difficult to design an efficient and effective system. Lack of awareness of your institution’s organizational and departmental needs, along with pie-in-the-sky thinking and indifference, may lead to vendor lock-in. Uploaded files are out of your custody; you have lost control. And you may be paying to store items that are ROT (redundant, outdated, and trivial).
Benefits of Cloud Computing
The benefits offered by cloud computing exacerbate three significant problems. First, the signal-to-noise ratio in cloud storage decreases significantly as more and more low-value data accumulates. Intellectual property—including documents, artwork, reports, calendars, marketing materials, photographs, and audiovisual recordings—must be actively managed and controlled. We are building a very large haystack around a much smaller number of needles.
Second, inefficient application search features bog down under large data sets. Performance issues in cloud computing are often due to application design and the enabling technology, not the cloud infrastructure. Unfortunately, institutions often move files to the cloud without first evaluating system design. Databases, middleware, and other technologies will fail without an understanding of content and context. Governance is an important component because controlled vocabularies evolve over time. As the ROT-laden data set in the cloud grows, search results become less accurate and less useful.
And third, long-term data retention and issues such as format compatibility and version control will continue to be pain points for management systems. Version control is a tool to ensure that final, approved versions are archived. Institutions often require interoperability and backwards compatibility of file formats, and cloud computing can provide those services online. This ensures that archived files can be located and downloaded as needed. However, verifying the authenticity of a file becomes complicated in the cloud.
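One common way to keep authenticity verifiable regardless of where a file is stored is a fixity check: record a checksum before upload and recompute it after download. A minimal sketch, assuming nothing about any particular cloud provider (the throwaway file here stands in for an archived asset):

```python
import hashlib
import os
import tempfile

def sha256_of(path, chunk_size=1 << 20):
    """Compute a SHA-256 checksum by streaming the file in chunks,
    so even very large assets never need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Demonstration with a temporary file standing in for an archived asset.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b"archived asset bytes")
    path = tmp.name

before_upload = sha256_of(path)   # recorded in a local fixity log
after_download = sha256_of(path)  # recomputed when the file returns
assert before_upload == after_download  # matching digests: bit-identical copy
os.remove(path)
```

If the recomputed digest ever differs from the logged one, the downloaded copy is not the file that was uploaded, whatever the provider's interface reports.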
Conclusion
Critical IT infrastructure will continue to migrate into the cloud. There are tradeoffs: without physical custody, and with changeable metadata, there is a loss of control. Enterprise computing integrates a number of moving parts, and institutions are learning to test and fix as they go. Vendor lock-in is a big risk as assets migrate into the infrastructure. Institutions that do not control their data will not control the information architecture.
Digital archivy provides practical, cost-effective solutions. If a cloud-computing storage strategy seems too easy and too attractive, it may be time to conduct a digital survey and inventory your assets. With forethought and a sound strategy, we can evaluate the options and select a cheaper, more effective, and more secure long-term plan for preserving your information and digital assets.