Artifacts in the Archives

The following slides and text are from a presentation at the Society of Florida Archivists/Society of Georgia Archivists Joint Annual Meeting in Savannah, GA on October 14, 2016.

The full data-set can be downloaded here.

[Slide 1] So with this transition from the grant-funded project to our regular UF operations, I was tasked with creating a processing plan for what remained unprocessed from the museum collection. This included a large assortment of artifacts and artwork, along with more standard archival documents and photographs. John and I met with the three members of the Panama grant project team to go over the work they had done so far and to gain an understanding of their processes. The way processing had been done over the project’s first 2 years was not how we would have preferred the work to be completed – and it somewhat complicated things from our archival viewpoint – but we decided to continue in the same manner to ensure that the entire collection got processed. While not ideal, keeping with the earlier practices would at least yield fairly consistent control and description of the collection.

[Slide 2] I set out to create the processing plan – actually my first ever solo processing plan – by surveying the collection holdings at our off-site storage facility, where the majority of the unprocessed items and records were held. The survey showed that we had about 7 linear feet of archival documents; over 200 framed art pieces, maps, and similar works; and almost 4,000 artifacts left to process. Now, I’m well-trained in archival processing and come from a long line of MPLP-style work, having received my early hands-on processing training with one of the PACSCL Hidden Collections projects in Philadelphia. I keep stats on my own processing whether administrators request them or not, and I’ve implemented some of the same metrics tracking at UF. So, I was pretty secure in estimating the needs for processing those 7 linear feet of archival records and photographs.

[Slide 3] What I wasn’t sure about was how to estimate processing of the art and artifacts. At PACSCL, we dealt with a small number of artifacts and tended to keep them within the archival collections. I also worked with the National Park Service for about a year, but there, artifacts were removed and processed by someone else. I headed to the web, as you do, to look for information on processing times for artifacts, but didn’t come up with anything of much use. The Park Service has a lot of information on how to budget money for artifact processing, but its manuals say nothing about time. With scant information available from other sources, I ended up making an educated guess and crossing my fingers (in the end, I guessed a bit too low).

[Slide 4] But this made me wonder – given our love of stats and assessment – why aren’t some general numbers for artifact processing available somewhere?

[Slide 5] I posed this question to John and he agreed. He had looked for this type of data before and found very little. He recalled a few times in the past when archivists or other professionals posed this question to the SAA listserv, and noted that they were generally met with responses about the unique nature of artifacts and how one couldn’t possibly generalize processing times for artifacts or artwork. But archivists used to say the same thing about our own paper collections, and we somehow managed to move on to the understanding that minimal processing usually takes around 4 hours per linear foot and item-level processing 8 to 10 hours per foot. I thought: we can do better. And with the advice and encouragement of my dear supervisor … a research project was born.

Along with John, I formed a small but professionally diverse group including Lourdes and Jessica, John’s highly knowledgeable wife Laura Nemmers, and a colleague from the Ringling Museum in Sarasota, Jarred Wilson. We started working on a survey for archivists and museum professionals, to figure out what data people had and how we could aggregate it into a generalized form that would be useful for budgeting and planning future processing projects. As is the focus of our talk here, this issue is becoming more and more common, and we all thought these sorts of metrics would prove useful to others in the future.

[Slide 6] In our first meeting we spent a lot of time deciding how to collect the data and also discussing terminology. Having a group with mixed archival and museum backgrounds led to discussions of what exactly each of us meant when we said accessioning, processing, inventorying, and other such terms. Where I say process, Jessica may say accession. Where I say minimal record, she may say inventory entry. Further research and discussions showed that even within one segment of the community, these terms didn’t describe the same tasks for everyone. So, we began to think that we should survey people about terminology before surveying them about data – to make sure we asked the right questions.

When we next met, we went over the survey I had devised to get a grip on the terminology questions – but it was still confusing and not actually getting at the point we were after. We also knew that surveys tend to have a small response rate, and we didn’t want to overburden the people who might participate in this project. So, back to the drawing board we went. Instead of asking people what they meant by each term and then asking how much time they spent on the tasks described, we decided to cut to the chase: describe the actions we meant and see if participants had data they could share, or would agree to collect some and send it to us.

[Slide 7] I sent out a general email asking for people who might be interested in taking part in a research survey regarding artifact processing within archival settings. That first request drew 31 responses from people interested in taking part in or learning more about the project. Then, once we had sorted out exactly what to ask for and how to format the data, I sent another, more specific request to just the people who had initially responded. A number of people dropped out after that request, and in the end only 6 people submitted data.

[Slide 8] But those 6 institutions (7 when we include UF) represented a wide variety of institution and record types, with participants including archivists, curators, and managers from academic institutions, museums, federal and city government, and public libraries.

[Slide 9] As for the data, we had devised a set of 9 categories that grouped artifacts by size or complexity, along with a general idea of how long each sort of item would take to describe. Of the participating institutions, 4 used these categories to collect data, while the other 2 sent in more generalized information based on how they normally collect or estimate processing times. At UF, we did a bit more processing of the artifacts with these categories in mind, since metrics had not been collected during the first 2 years of the project and having our own data involved seemed like a good idea.

[Slide 10] Here you can see the data parsed out by category, showing the average amount of time for either minimal or full processing in each of the 9 categories. The entries marked “null” mean that no data was received for that category at that level of processing. (And you may notice the one outlier in category 3, where each item took almost 3 hours to process. Those were some pretty intense dioramas that skewed the data wildly for that category, but they don’t have much of an impact on the final averages.)

[Slide 11] Here you can see the average overall processing times in a few different ways. Processing time for the categorized items comes in at around 8 minutes per item for minimal processing and almost 22 minutes per item for full processing. All of the categorized processing averages out to just over 19 minutes per item. When I couple this data with the numbers from the 2 institutions that sent only generalized data, the final number goes up by only about 20 seconds. So what we have in the end is that, with or without the categories, an artifact can generally be expected to take roughly 20 minutes to process (I had estimated 10 minutes in my processing plan). This is an aggregate, so the processing times of individual items will obviously vary dramatically. But for large collections of objects, knowing that you have, say, 2,000 items to process at roughly 20 minutes per item allows an institution to propose a relatively reliable timeline (5 to 6 months) for project planning and budgeting.
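
To make that arithmetic concrete, here’s a quick back-of-the-envelope sketch in Python. The 30-hour processing week is an assumption I’m adding for illustration (processing is rarely anyone’s only duty); swap in your own staffing reality.

```python
# Rough project-planning arithmetic from the aggregate survey figure.
ITEMS = 2000              # artifacts awaiting processing
MINUTES_PER_ITEM = 20     # aggregate average from the survey data
HOURS_PER_WEEK = 30       # assumed hours actually spent processing

total_hours = ITEMS * MINUTES_PER_ITEM / 60
weeks = total_hours / HOURS_PER_WEEK
print(f"{total_hours:.0f} hours, ~{weeks:.0f} weeks, ~{weeks / 4.33:.1f} months")
# -> 667 hours, ~22 weeks, ~5.1 months
```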

I would like to see a larger data-set to create more useful guidelines for processors going forward, and we’re continuing to collect numbers at UF, but for now, this is what we have. Also, just a quick thanks to everyone who participated in this study.

[Slide 12]

National Tracing Center of the Bureau of Alcohol, Tobacco, Firearms, and Explosives

by Steve Ammidown* and Steve Duckworth (original post appeared on the SAA I&A Roundtable “Archivists on the Issues” blog on July 7, 2016; this post updated on September 19, 2016 to include new information and articles)

Let’s talk about guns. And records management. And maybe some advocacy.

According to their website, the Bureau of Alcohol, Tobacco, Firearms, and Explosives (ATF) is

a law enforcement agency … that protects our communities from violent criminals, criminal organizations, the illegal use and trafficking of firearms, the illegal use and storage of explosives, acts of arson and bombings, acts of terrorism, and the illegal diversion of alcohol and tobacco products. [They] partner with communities, industries, law enforcement, and public safety agencies to safeguard the public [they] serve through information sharing, training, research, and use of technology.

That’s a big job. But as we’re about to see, “information sharing” and the “use of technology” are pretty restricted at the ATF.

The National Tracing Center (NTC) is the firearms tracing facility of the ATF. Located in Martinsburg, WV, it is the only facility in the United States that can provide law enforcement agencies (local, state, federal, and international) with information used to trace firearms in criminal investigations, gun trafficking cases, and other movements of firearms, both domestic and international. And they are drowning in records. According to recent reports, roughly 1.6 million records arrive at the facility each month. Records usually come from defunct firearms dealers, who are required to submit their records when they go out of business. (For dealers still in business, the NTC contacts them after tracing a weapon through its manufacturer.) There appear to be no standards in place for how dealers have to keep or submit these records. There is a form (4473) for the actual gun purchase, but other records can come in on computer media or as hand-written documents. They often arrive somewhat damaged; partially shredded or water-damaged records are frequently cited in news reports. Some people even send theirs in on rolls of toilet paper.

As it is currently illegal to create a registry of firearms in the U.S., the idea of a searchable database is also off the table. This leaves workers at the facility with the task of sifting through these records manually to complete traces. Upwards of 365,000 traces are requested each year, and the number will just keep growing (due in part to the Obama administration’s requirement that every gun involved in a crime be traced). While records are now being digitized to ease access somewhat and relieve pressure on physical storage space, they remain non-searchable and amount to a newer version of microfilmed records. Even these digitization efforts are problematic, as a recent Government Accountability Office report showed. The GAO found that some digital record systems violated the appropriations restriction because, among other things, the records were kept on a single server and allowed access to too much data.

Some of the problems at the NTC can be traced to a lack of consistent leadership and chronic underfunding. The position of agency director went unfilled from 2006 to 2013 due to legislation backed by the National Rifle Association (NRA) that requires Senate confirmation to fill the position. In 2013, acting director B. Todd Jones was narrowly confirmed as Director, but he resigned soon after (in 2015), and the position has yet to be filled permanently. Stagnant funding has prevented the ATF from keeping up with demand. In addition to the overwhelmed workers at the NTC, the agency has just over 600 inspectors to audit the record-keeping of over 140,000 gun dealers across the country.

The data collected by the NTC is also subject to the NRA’s legislative sway in Washington. As mentioned, the NRA has been successful in heading off any attempt at creating a searchable database, arguing that such a mechanism would be a “registry” in violation of the Second Amendment. Taking it one step further, a set of provisions known as the “Tiahrt Amendments” has been attached to every U.S. Department of Justice appropriations bill since 2003, prohibiting the NTC from releasing information to anyone other than a law enforcement agency or prosecutor in connection with a criminal investigation. The law effectively blocks this data from being used in academic research on criminal gun use or in civil lawsuits against gun sellers or manufacturers. It also prevents the ATF from collecting inventory information from gun dealers, which would further help identify lost or stolen guns. The Law Center to Prevent Gun Violence argues that these amendments only empower criminals and reckless dealers.

Restrictive laws and a lack of quality management have led to a massive backlog of records and a severely limited system for filling the great number of trace requests the NTC receives. The antiquated measures required of the NTC restrict law enforcement’s ability to perform its duties. While public opinion regarding gun sales seems to be turning (unlike Congress’s voting record), the idea of a database seems quite far off. An effective and permanent director at the ATF would be a good starting place, but as of this writing, no nomination appears to be in place (and given the current political climate, one isn’t likely anytime soon).

Not unlike the rest of the gun debate, the political debate around the NTC and gun tracing data seems intractable and unlikely to change. Luckily for us, however, we’re archivists and records managers! We offer a unique perspective when we contact our elected officials on this topic. We’ve been in the dusty stacks (yes, we said it) and dealt with unwieldy access systems when time was of the essence. We should be arguing for the modernization and full funding of the NTC and the repeal of the Tiahrt Amendments – at the least to improve access to government information, and at the most to help save lives. So consider this your call to advocacy (as mentioned at the start). If you feel this situation warrants action, take it and contact your legislators now!

*Steve Ammidown is the Manuscripts and Outreach Archivist for the Browne Popular Culture Library at Bowling Green State University in Bowling Green, Ohio.

Sources:

Bureau of Alcohol, Tobacco, Firearms, and Explosives (ATF), https://www.atf.gov

Government Accountability Office, June 2016 “Report to Congressional Requesters: ATF Did Not Always Comply with the Appropriations Act Restriction and Should Better Adhere to Its Policies” (GAO-16-552), http://www.gao.gov/assets/680/678091.pdf.

Law Center to Prevent Gun Violence, “Maintaining Records on Gun Sales,” http://smartgunlaws.org/gun-laws/policy-areas/gun-dealer-sales/maintaining-records-on-gun-sales/

National Tracing Center (NTC), https://www.atf.gov/firearms/national-tracing-center, (informational brochure: https://www.atf.gov/firearms/docs/national-tracing-center-information-industry-members-atf-p-331210/download)

National Tracing Center via Wikipedia (good general overview), https://en.wikipedia.org/wiki/National_Tracing_Center

The White House, “Presidential Memorandum – Tracing of Firearms in Connection with Criminal Investigations,” https://www.whitehouse.gov/the-press-office/2013/01/16/presidential-memorandum-tracing-firearms-connection-criminal-investigati

News reports:

2016 August 30, GQ, “Inside the Federal Bureau of Way Too Many Guns,” http://www.gq.com/story/inside-federal-bureau-of-way-too-many-guns?mbid=social_facebook

2016 August 24, The Trace, “The ATF’s Nonsensical Non-Searchable Gun Database, Explained,” https://www.thetrace.org/2016/08/atf-ridiculous-non-searchable-databases-explained/

2016 March 24, America’s 1st Freedom [NRA magazine], “Where the ATF Scans Gun Sales Records,” https://www.americas1stfreedom.org/articles/2016/3/24/where-the-atf-scans-gun-sales-records/

2016 January 6, The Guardian, “Agency tasked with enforcing Obama’s gun control measures has been gutted,” https://www.theguardian.com/us-news/2016/jan/06/bureau-alcohol-tobacco-firearms-obama-gun-control-measures-funding-understaffing

2015 October 27, USA Today, “Millions of Firearms Records Languish at National Tracing Center” http://www.usatoday.com/story/news/nation/2015/10/27/firearms-national-tracing-center-atf/74401060/

2015 March 20, USA Today, “ATF director announces resignation,” http://www.usatoday.com/story/news/nation/2015/03/20/atf-director-b-todd-jones-resigns/25081713/

2013 June 11, Media Matters, “How the NRA Hinders the ATF Director Confirmation Process,” http://mediamatters.org/research/2013/06/11/how-the-nra-hinders-the-atf-director-confirmati/194412

2013 May 20, NPR, “The Low-Tech Way Guns Get Traced,” http://www.npr.org/2013/05/20/185530763/the-low-tech-way-guns-get-traced

2013 March 13, InformationWeek, “ATF’s Gun Tracing System is a Dud,” http://www.informationweek.com/applications/atfs-gun-tracing-system-is-a-dud/d/d-id/1109062

2013 February 19, WJLA ABC7 (Washington, D.C.), “ATF National Tracing Center Traces Guns the Old-Fashioned Way,” http://wjla.com/news/nation-world/atf-national-tracing-center-traces-guns-the-old-fashioned-way-85417 (YouTube: https://www.youtube.com/watch?v=4lFdLaYcDNQ)

2013 January 30, CBS Evening News, “Tracing Guns is Low-tech Operation for ATF,” http://www.cbsnews.com/news/tracing-guns-is-low-tech-operation-for-atf/

2011 November 2, Law Center to Prevent Gun Violence, “Federal Law on Tiahrt Amendments,” http://smartgunlaws.org/federal-law-on-tiahrt-amendments/

2010 October 26, Washington Post, “ATF’s Oversight Limited in Face of Gun Lobby,” http://www.washingtonpost.com/wp-dyn/content/article/2010/10/25/AR2010102505823.html

Research Post: Fire at the Cinemateca Brasileira

This post first appeared on the Society of American Archivists’ Issues & Advocacy Roundtable blog.

I&A Research Teams are groups of dedicated volunteers who monitor breaking news and delve into ongoing topics affecting archives and the archival profession. Under the leadership of the I&A Steering Committee, the Research Teams compile their findings into Research Posts for the I&A blog. Each Research Post offers a summary and coverage of an issue. This Research Post comes from On-Call Research Team #2, which is mobilized to investigate issues as they arise.

Please be aware that the sources cited have not been vetted and do not indicate an official stance of SAA or the Issues and Advocacy Roundtable.

Summary of the Issue

A fire broke out in the film library of the Cinemateca Brasileira in São Paulo on February 3, 2016. The exact cause of the fire was not reported, but the area involved was where nitrate film was stored. This material is known to be volatile and can spontaneously combust due to environmental factors. Sources reported that approximately 1,000 rolls of film burned in the fire. All is not lost, however, as the institution states that the films lost in the fire had been preserved in other media formats (though some reports put that figure at 80%). Reports of the fire came out soon after the event occurred, but updates and further information have not been located. While there are many reports, especially in Portuguese, almost all of them date from February 3 or 4, and each appears to leave some questions on the table.

The fire occurred in one of the institution’s nitrate film warehouses, which are specially designed to house such film; there is no electric grid and interior walls do not reach the ceilings. Most sources report that it took about 30 minutes to contain the fire. Some video footage of the scene can be found here.

The Cinemateca Brasileira holds some 250,000 film rolls, including features, short films, and newsreels, as well as books, papers, movie posters, and other paper records; this loss represents 0.4% of its film holdings. The history of the Cinemateca can be traced back to 1946 and the Second Film Club of São Paulo (the First having been closed by the Department of Press and Propaganda in 1941). In 1948, the Club became affiliated with the International Federation of Film Clubs and, in 1949, with the film department of São Paulo’s newly created Museum of Modern Art. In 1964, it was incorporated into the Ministry of Culture, becoming a governmental institution. Previous fires occurred in 1957, 1969, and 1982, all due to nitrate film. The institution moved into its current facilities, built under the technical guidance of the International Federation of Film Archives (FIAF), in 1998.

The Archivist Rising blog reported that the institution had suffered relatively recent budget cuts amid a large financial crisis. Blogger Aurélio Michiles also blames the incident on those cuts, but describes them as more of a punishment aimed at the administration than a consequence of an overall financial crisis. Further sources state that the number of employees has been reduced from over 100 in 2013 to just over 20 currently, though it remains unclear how many employees are governmental workers and how many are actually employed by the Cinematheque’s Friends Society (Sociedade Amigos da Cinemateca), and whether that distinction affects the various numbers reported by different sources. The truth behind this budget controversy is left for further research – preferably by someone proficient in Portuguese.

Bibliography of coverage of the issue:

“Some thousand film rolls burnt in Cinemateca Brasileira fire.” EBC Agencia Brasil. Accessed 2016 March 25. http://agenciabrasil.ebc.com.br/en/cultura/noticia/2016-02/some-thousand-film-rolls-burnt-cinemateca-brasileira-fire (EBC manages TV Brasil, TV Brasil International, Agência Brasil, Radioagência, and the National Public Broadcast System. Beyond its commitment to public communication, its stated values include editorial independence, transparency, and participatory management.)

“Cinemateca Brasileira.” Wikipedia. Accessed 2016 March 25. https://en.wikipedia.org/wiki/Cinemateca_Brasileira

“Ministério da Cultura não tem plano para evitar novos incêndios na Cinemateca Brasileira” [Ministry of Culture has no plan to prevent new fires at the Cinemateca Brasileira]. Estadão. Accessed 2016 March 25. http://cultura.estadao.com.br/noticias/cinema,ministerio-da-cultura-nao-tem-plano-para-evitar-novos-incendios-na-cinemateca-brasileira,10000014848

“Ministério da Cultura mudará gestão da Cinemateca” [Ministry of Culture will change the management of the Cinemateca]. Folha de S.Paulo. Accessed 2016 March 25. http://www1.folha.uol.com.br/ilustrada/1220662-ministerio-da-cultura-mudara-gestao-da-cinemateca.shtml

The I&A Steering Committee would like to thank Steve Duckworth for writing this post, and Rachel Seale and Alison Stankrauff for doing key research on the issue.

I&A On-Call Research Team #2 is:

Alison Stankrauff, Leader
Katherine Barbera
Anna Chen
Steven Duckworth
David McAllister
Rachel Seale

If you are aware of an issue that might benefit from a Research Post, please get in touch with us: archivesissues@gmail.com.

Code4Lib2016 Conference Review

This post was written for, and first appeared on, SAA’s SNAP roundtable blog.

Code4Lib 2016 was held in Philadelphia, PA from March 7 to 10 along with a day of pre-conference workshops. The core Code4Lib community consists of “developers and technologists for libraries, museums, and archives who have a strong commitment to open technologies,” but they are quite open and welcoming to any tangentially related person or institution. As a processing archivist whose main experience has been with paper documents, I thought I would feel confused and out of place for the length of this conference, but, while I had my moments, I left feeling more knowledgeable about efforts and innovations within the coding community, giddy with ideas of projects to bring to my own workplace, and incredibly glad that I stepped outside of my archival comfort zone to attend (and present at!) this conference. (And I have to thank our university’s Metadata Librarian, Allison Jai O’Dell, for asking me to present with her. Without her reaching out to me, I likely wouldn’t have gotten involved in the conference to begin with.)

So, before Code4Lib, there was Code4Arc – at least, as a preconference workshop. Code4Arc focused on the specific coding and technology needs of the archivist community, and on the need to make Code4Arc an actual thing rather than just an attachment to Code4Lib. While the two communities overlap quite a bit, archivists obviously have their own niche problems, and coders can often help sort those problems out. Also, having a direct line between consumer-with-a-problem and developer-with-a-solution would prove quite beneficial to all parties involved. The day was divided into a series of informal discussions and more focused breakout groups, along with some updates from developers. The end result mainly boiled down to continuing the discussion about our needs as a community, communicating and sharing knowledge and data more openly, and focusing efforts on specific problems that affect many archives. We’ve formed some ad hoc groups and will likely have more to say in the not-too-distant future.

As to the conference proper, I’ll start by noting that a ton of information is available online. The conference site lists presentations, presenter bios, and links to twitter handles and slides where available. Three series of Lightning Talks emerged during the conference; information on those can be found on the wiki, which is full of useful information and links. And everything was recorded, so you can watch the presentations on the Code4Lib YouTube channel. The conference presentations were almost a series of lightning talks themselves: each was allotted 10 to 20 minutes, with 6 groups of presentations given over the course of the conference, along with 2 plenary talks. So, while it was a nice change from the usual conference configuration, it made for a rather exhausting (but engaging) experience. Having said that, I will only mention a few of the presentations that resonated most with me or relate more specifically to archival work (because seriously, I saw over 50 in the course of 2.5 days). But again, I stress, totally worth it! And they feed you. A lot!

So on day one (inserts shameless plug), Allison Jai O’Dell and I presented The Fancy Finding Aid (video | slides). We talked about some front-end design solutions for making finding aids more interactive and attractive. Allison is wicked smart and also offered up a quick lightning talk on day three about the importance of communicating, often informally, with your co-workers (video). Other presentations of note from day one include Shira Peltzman, Alice Sara Prael, and Julie Swierzek speaking about digital preservation in the real world in two separate presentations, “Good Enough” preservation (video | slides) and Preservation 101 (video | slides). Eka Grguric broke down some simple steps anyone can take towards Usability Testing (video | slides) and Katherine Lynch shared great ideas regarding Web Accessibility issues (video | slides). Check out the slides for lots of great links and starting points, like testing out navigability by displacing your mouse or using a screen reader with your monitor off.

Matienzo: Ever to Excel

Later on, Mark Matienzo discussed the ubiquity of the spreadsheet in Ever to Excel (video | slides). The popularity of spreadsheets may come from the hidden framework that shields users from low-level programming, making them feel more empowered. Lightning talks included the programming committee asking for help with diversity in #ProgramSoWhite (video | slides), a focus repeated the following day in a diversity breakout session. Ideas generated from the diversity talks focused on further outreach with schools and professional organizations, scholarship initiatives for underrepresented populations and newer professionals, and the need for those in the coding community to seek collaborators in other areas to bring new voices into the community.

Angela Galvan, in her talk titled “So you’re going to die” (video | related notes), spoke about digital estate management and the need to plan for what happens to digital assets after someone dies. Though humans now post so much of their lives online, we are still relatively silent about death. Yuka Egusa’s talk about how non-coders can contribute to open source software projects was particularly popular (video | slides). She notes that engineers love coding, but generally don’t like writing documentation. Librarians and archivists can write those documents and training manuals, and we can aid with reporting bugs and usability testing. Don’t let lack of coding knowledge keep you from being part of innovative programs that interest you.

Yoose: libtech burnout

Day two: Becky Yoose gave an exhilarating talk about protecting yourself from #libtech burnout (video | slides). In the lightning talks, Greg Wiedeman spoke about his Archives Network Transfer System (video | more info), an interesting solution to a problem Code4Arc focused on, which also highlights the need for a simpler way to structure the transfer of digital materials to the archives.

Andreas Orphanides gave a great talk about the power of design. Architecture is Politics (video | slides) highlighted how, intentionally or not, your web and systems designs are political, and how politics likewise influence your design. The choices you make in design can control your users, both explicitly and subtly, and politics can shape those choices in the same way. Design is thus a social justice issue, and you need to be active in knowing your users, recognizing your own biases, and diversifying your practices. Matt Carruthers talked about Utilizing digital scholarship to foster new research in Special Collections (video | slides). This University of Michigan project provides visualization-on-demand customized to a patron’s research question. Though the project is still in the early stages of development, they are extracting data from EAD files they already have to create EAC-CPF connections. This data is then used to visualize networks, with online access to the visualizations offered to users. It is the start of a fascinating new way to provide further discovery and access in archives and special collections.
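
For the curious, the general approach might look something like this rough Python sketch – my illustration of the concept, not Michigan’s actual code. The file name, the assumption of a namespaced EAD 2002 document, and the naive everyone-co-occurs pairing are all stand-ins.

```python
# Sketch: harvest personal and corporate names from an EAD 2002 finding
# aid and pair them into co-occurrence edges for a network visualization.
from itertools import combinations
import xml.etree.ElementTree as ET

EAD_NS = "urn:isbn:1-931666-22-9"   # EAD 2002 namespace (if the file uses it)
tree = ET.parse("finding_aid.xml")  # hypothetical file name

names = {el.text.strip()
         for tag in ("persname", "corpname")
         for el in tree.iter(f"{{{EAD_NS}}}{tag}")
         if el.text and el.text.strip()}

# Naive network: every name in the finding aid is linked to every other;
# a real project would scope co-occurrence to components or folders.
edges = list(combinations(sorted(names), 2))
for a, b in edges[:10]:
    print(a, "--", b)
```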

Day three’s lightning talks included Sean Aery from Duke speaking about integrating digital collections and finding aids, with some great ways to maintain context while doing so (video | slides); Heidi Tebbe recommending the use of GitHub as a knowledge base, not just a place for code (video | slides); and Steelsen Smith pointing out the various issues that can arise with assorted sign-on systems and how single sign-on can actually open up systems to more users (video | slides).

And lastly, Mike Shallcross discussed a University of Michigan project that I’ve been following closely, the ArchivesSpace-Archivematica-DSpace Workflow Integration (video | slides). They are working to overhaul archival management by bringing ArchivesSpace and Archivematica together with a DSpace repository to standardize description and create a “curation ecosystem.” We’re closing in on a similar project where I work, and Mike has been making regular (and rather entertaining) blog posts about the Michigan project, so it was good to hear him in person. (If interested in more, check out their blog.)

Orphanides: Architecture is Politics

Oh, the plenary talks. I almost forgot. They were great. The opening talk by Kate Krauss of the Tor Project focused on social justice movements in the age of online surveillance (video | slides), and the closing talk by DuckDuckGo founder Gabriel Weinberg (video) similarly focused on privacy and related concerns in online searching.

So, it was a great conference. Definite themes emerged: creating better access and more privacy for users; getting out of your normal routine and envisioning projects from another perspective; communicating better and more openly within and around our own community; and using all of this to better document and support underrepresented communities around the world. I’ve now said too much. I hate reading long blog posts. But I definitely recommend this conference to anyone in the library and archives fields with any inkling of interest in digital projects. It’s a great way to get new ideas, see that you aren’t alone with your out-of-date systems, and meet some great people you may not normally get to interact with.

code4lib 2016: The Fancy Finding Aid

Fancy Finding Aid – PowerPoint slides (PDF format, 1 MB)
Fancy Finding Aid – PowerPoint files

Video of presentation:

Reaction on twitter:

Shuck it!

I was recently reminded of the usefulness of the oyster shucker as an archival implement. Yes, that’s right, the humble oyster shucker.


It’s not something I ever envisioned using in the archives. It’s not something I took courses on in library school. It was never mentioned during my internships. But it is a tool I feel we all need to be a bit more educated about. It has a multitude of uses, and it’s just kind of fun to say.

It expertly assists with:

unbinding
staple-prying
holding
defense
and shucking oysters – of course
(though perhaps don’t use the same one you use in the archives)

Oyster shuckers at Apalachicola, Fla. This work is carried on by many young boys during the busy seasons. (NARA 523162)

So, here’s to the wonderful and versatile oyster shucker! And thanks to the National Park Service in Anchorage for bringing this shucker into my life.

Metrics for hybrid collections

Archival repositories are, more and more, finding themselves in the position of processing and housing hybrid collections of standard archival documents and what would generally be referred to as artifacts or museum-style objects. While archivists have become quite adept at tracking timing statistics for processing paper-based collections, few similar, timing-based metrics appear to exist, at least in professional literature, to aid in planning for processing of these hybrid collections.

Our group of archivists and museum professionals is interested in closing this gap in metrics for the information science community as a whole. If you or your institution currently keep metrics on such holdings and would be willing to participate in a future survey regarding your collecting and processing habits, please email me at steveduckworth@ufl.edu. I would appreciate hearing from you by February 29, 2016.

Please feel free to share this message with other interested parties. Thank you.

Steve Duckworth | Processing Archivist
Department of Special & Area Studies Collections
George A. Smathers Libraries
University of Florida
200A Smathers Library
352.273.2655

On Chronological Order

I’m just going to put this out into the world, but in a place I can easily find it again. Dating archival material can get rather specific. Frequently, materials are then filed in chronological order and those specific and extensive dates can get a bit tricky to organize. So, with former colleagues from PACSCL (h/t to Annalise Berdini, among others), we’ve come up with this example list of what order these things go in, from most specific (or first) to least specific (or last). I find that I refer to it quite frequently when doing physical arrangement. Perhaps one day it will become ingrained in me and I can stop checking this list, but until that day comes, and perhaps as a small benefit to other archives processors out there, the list is as follows:

1931 January 1
1931 January
1931
circa 1931
1931, 1945
1931, 1945-1946
1931, undated
1931-1932
undated

One can then use this established sequence for further expansion. For example, something like “circa 1931, 1945” would go after “1931, 1945”, and so on. I still sometimes have qualms about the position of entries with circa dates, but I think this list is accurate and helpful. So, that’s it. Go put things in order!
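
And for anyone who’d rather make a computer do the filing, here is a minimal Python sketch of a sort key that reproduces the list above. It handles only the forms shown (plus the circa expansion); “date_key” and “segment” are names I made up, and real collections will surely surface date expressions it doesn’t cover.

```python
import re

MONTHS = {m: i + 1 for i, m in enumerate(
    ["January", "February", "March", "April", "May", "June",
     "July", "August", "September", "October", "November", "December"])}

BIG = 9999  # sentinel that sorts after any real year, month, or day

def segment(seg):
    """Reduce one comma-separated piece ('1945', '1945-1946', 'undated')
    to a comparable (start, end) pair."""
    seg = seg.strip()
    if seg == "undated":
        return (BIG, BIG)
    start, _, end = seg.partition("-")
    return (int(start), int(end) if end else int(start))

def date_key(expr):
    """Sort key reproducing the ordering in the list above."""
    expr = expr.strip()
    if expr == "undated":
        return (BIG, BIG, BIG, BIG, ())          # wholly undated files last
    circa = 1 if expr.startswith("circa ") else 0
    expr = expr[6:] if circa else expr           # drop the 'circa ' prefix
    first, _, rest = expr.partition(",")
    m = re.match(r"^(\d{4})(?:\s+([A-Za-z]+))?(?:\s+(\d{1,2}))?$", first.strip())
    if m:                                        # single year, maybe month/day
        year = int(m.group(1))
        month = MONTHS.get(m.group(2), BIG)      # missing month sorts last
        day = int(m.group(3)) if m.group(3) else BIG
        if rest.strip():                         # '1931, 1945' and friends
            extras = tuple(segment(s) for s in rest.split(","))
            return (year, month, day, 2 + circa, extras)
        return (year, month, day, circa, ())
    start, end = segment(first)                  # a range like '1931-1932'
    return (start, BIG, BIG, 4 + circa, ((start, end),))

dates = ["1931-1932", "undated", "circa 1931", "1931, undated",
         "1931 January", "1931, 1945-1946", "1931", "1931, 1945",
         "1931 January 1"]
print("\n".join(sorted(dates, key=date_key)))    # matches the list above
```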

From Student to Professional

This post originally appeared on the SAA Students and New Archives Professionals Roundtable Blog in their “Transitions” series, “which highlights the experiences of recent graduates and early career archivists.”

I graduated from the MSLIS program at Drexel University, with a concentration in Archives, in December of 2013. About 6 months later, I found myself gainfully employed (although temporarily) as a Project Archivist with the National Park Service in Anchorage, Alaska. My move from student to new professional, while rife with the standard issues we all face, was compounded by moving roughly 4,000 miles away from almost everyone I knew and all the support structures I had built for myself over the years. Luckily, the education and preparatory experience I had in Philadelphia gave me a solid foundation for succeeding in this new adventure.

I came to the archives field late in life. In my “youth,” I earned degrees in music performance and spent much of my 20s and early 30s working as a freelance cellist in various locales. As I began to question the future and what I could see myself happily doing for the rest of my life – in a more stable environment – I came to see Library Science as a perfect option.

Fast forward to about three months before graduation: I had just begun work as an Archives Processor with the Philadelphia Area Consortium of Special Collections Libraries (PACSCL) Hidden Collections project. This was the perfect environment to learn real-world processing skills and put all the theoretical knowledge I had learned at school to work. In this collaborative environment, working with other students near graduation and under the direction of a remarkable supervisor and mentor, we all gained skills that will serve us well throughout our professional careers. Taking that position was a leap for me: I left a full-time job with great benefits for a part-time job with none, in the hope that it would help pave the road for a career in my new profession. It turned out to be one of the best choices of my life. If you are currently a student, I urge you to seek out situations where you can work with people who will teach and challenge you. Attaining the degree alone is not enough.

Thanks in large part to the work with PACSCL, I was hired for my position in Alaska. This was another huge leap that I felt I had to take. It was a great position that would give me further experience in the field and it was a bit of an adventure. As a musician and a rather independent person, I’d become somewhat of a gypsy – moving from place to place and never really feeling like staying put, so I thought, “What the [heck],” and said yes.

My position in Anchorage is basically that of a processing archivist. I’m working through all the records (close to 200 linear feet) of the largest national park in America. I’m using the principles of minimal[i] and maximal[ii] processing to streamline the way the park service here has traditionally cared for its paper records. Thanks to these more efficient processing ideals, I’ve been able to process all of the park’s records rather than roughly half of them, as was originally proposed. I’ve also gotten to do some accessioning, budgeting, forecasting, and reference work while here. Working as somewhat of a “lone arranger” has also allowed me to take more responsibility for arrangement and description decisions, given me more project management experience, and increased the trust I have in my own instincts. So, all in all, the experience I’ve gained has been wonderful.

There are, of course, some downsides. Though I work with others in the Cultural Resources department, I am the only archivist on staff here. My supervisor is very knowledgeable and helpful, but for certain issues of processing and preservation, I generally find myself turning to colleagues from Philadelphia and beyond. By maintaining those relationships, which are now a comfortable blend of professional and personal, I can reach out for advice and also to share interesting and humorous finds.

Another downside I’ve noticed stems from the temporary nature of my position. Knowing that my time here is limited has created certain social restraints; it’s difficult to invest too much in a place when you know from the start that you won’t be around all that long. This has led to some isolation, both socially and professionally. I’ve also come to see that these project positions aren’t just hard on the archivist; they’re hard on the institution too. After working through 200 feet of documents, I feel I’m just getting a somewhat-solid grasp on the inner workings of the National Park Service. And with each state or region and each park having its own unique issues, moving on to the next collection would only add to my understanding. However, I’m not going to be adding on; I’m going to be moving on – to a new position with new issues – and, when funding becomes available, the park service will have to find another archivist for its next project, who will have to go through all of this learning and adjustment again. This story could go similarly for any institution. I understand the financial reasons behind the system we’ve created here – project funding, grants, etc. – but I think the drawbacks of that system may be pricier than they seem at first glance. But I digress.

A poster[iii] I saw at the 2014 SAA meeting showed that it generally takes 6 to 12 months for people to assimilate into a new location. I can attest to that: it has taken me about 6 months to start feeling like I have a small support base of friends here, and soon I’ll be moving on. While the work I’ve done here has been fulfilling, and I’ve seen some amazing things in Alaska, I am now of the mindset that my next position needs to be permanent, or at least in a place where I plan on spending a significant portion of the rest of my life. Everyone handles these changes differently, and Alaska is obviously more remote than most temporary positions will take you, but keep this idea in mind as you look for your first professional position, and be honest about how you’ll deal with feeling isolated for an extended period of time. For me, I think my gypsy days are numbered.

Be that as it may, the training I received from Drexel has served me well. I have a strong foundation of theories and principles on which to grow, I know where to look for further information on topics that arise, and I have a broad base of knowledge concerning general library and information science topics that supports further growth in the field. However – and not to discount anything taught in library science programs (even with the issues we know they all have) – the training I received through internships and entry-level positions during and just after graduate school has given me the most help in transitioning into a professional role. That training and experience, coupled with the professional contacts and connections I’ve made, reaches further than anything I learned in a classroom. My advice to all students and new professionals is to make and cultivate professional connections in your own life and to take calculated risks when they arise.

 

[i] Mark A. Greene and Dennis Meissner, “More Product, Less Process: Revamping Traditional Archival Processing,” The American Archivist 68, no. 2 (2005): 208-263.

[ii] Robert Cox, “Maximal Processing, or, Archivist on a Pale Horse,” Journal of Archival Organization 8, no. 2 (2010): 134-148.

[iii] Wendy Cole, Steven Wade, Karen Dafoe, and Victoria Hess (Louisiana State University SAA Student Chapter), “Leaving Home: Taking a Job Outside Your Comfort Zone” (poster at the Annual Meeting of the Society of American Archivists, Washington, DC, August 10-16, 2014).

 

Step Two: Process All the Data

Where was I? Oh yes, in my last processing-related post, I had just finished the physical processing of my collection. I’ve now finished all of the folder title data entry and am in the final stages of editing scope notes and other descriptive text. (Side note: I probably need a “Step One-Point-Five: Arrange All the Records” post, but frankly, it’s not that intriguing. “Put things in order.” Done! Back to step two.)

Excel and the joys of XML (which I hardly understand myself) saved me a bunch of time in this process. This collection has 534 boxes (plus some flat files and records not physically in my possession), with 4,631 folder titles. And I spent about 50 hours typing them all into the computer. So far, I’ve also spent about 35 hours editing that data and the scope notes for the collection.
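
The basic idea looks something like this sketch (a simplified illustration, not my exact spreadsheet – the file and column names here are assumptions): export the folder-title rows from Excel as CSV, then let a few lines of Python spit out EAD component entries for the finding aid.

```python
# Illustrative sketch: folder-title rows exported from Excel as CSV
# (columns assumed to be box, folder, title, date) become EAD <c>
# file components ready to drop into a finding aid's <dsc>.
import csv
from xml.sax.saxutils import escape

with open("folders.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        print(
            '<c level="file"><did>'
            f'<container type="box">{escape(row["box"])}</container>'
            f'<container type="folder">{escape(row["folder"])}</container>'
            f'<unittitle>{escape(row["title"])}</unittitle>'
            f'<unitdate>{escape(row["date"])}</unitdate>'
            '</did></c>'
        )
```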

The actual final (post-arrangement and error-checking) numbers on the collection size are these:

  • Original collection size: 220 linear feet
  • Final collection size: 184.38 linear feet (or 180.48 cubic feet) (about a 16% reduction in physical size)
Wrangell-St. Elias National Park and Preserve records, awaiting labels

Processing speed depends on how you look at it. Based on the original size of the collection (which is how I’ve always done this), I’m at just over 3 hours per linear foot. Very speedy. Based on the final size, I’m at 3.50 hours per linear foot or 3.66 hours per cubic foot. Still pretty speedy. Data entry averaged out to a bit over 3.5 feet per hour. And now I’m left with writing and editing narrative text for the finding aid and putting labels on things (post-its are not exactly kosher in archives-land), which is great because . . .

I’m moving to Florida in 2 weeks. I’ve accepted the position of “Processing Archivist” at the University of Florida and I’m very happy to be moving back close to home and taking a job which sounds interesting and has no attached end-date. I’ll be processing across all of their collections, so stay tuned for further tales of processing and, hopefully, humorous finds via Instagram/Twitter (see buttons above). While Alaska has been a unique experience, I’m overjoyed for this new opportunity and happy to be ahead of schedule and finishing up this project before leaving.