
We Stand with Orlando

The 2016 Orlando nightclub shooting was a domestic U.S. terrorist attack that occurred on June 12, 2016, at Pulse, a gay bar and nightclub in Orlando, Florida.

The attack was one of the deadliest mass shootings in United States history, the deadliest incident of violence against LGBT people in United States history, and the deadliest attack on civilians in the U.S. since September 11, 2001.

Our sympathies and support go out to the LGBT community in Orlando and around the world.


Planning for Copyright Permissions

This excerpt is from chapter three, “Obtaining and Managing Copyright Permissions,” of Pocket Copyright Guide for Publishers. Because copyright is the lifeblood of publishing, a basic knowledge of copyright law is crucial to working effectively with authors on such issues as transfers of copyright, terms of copyright, terminations, and ownership.


Publishers deal with permissions to use copyrighted content on a daily basis. Chapter 3 addresses permissions to include portions of copyrighted books, articles, etc., in a publisher’s own works as well as granting permission to use works on which the publisher holds the copyright.

This chapter is broken down into tips on Obtaining Permissions as well as Granting Permissions.

OBTAINING PERMISSION

Including portions of others’ works in something a publisher is producing likely requires permission in order to avoid infringing the copyright. Recall, however, that not everything is subject to copyright protection; facts, book titles, and authors’ names, for example, are not protected. Further, not every use of a copyrighted work requires permission.

Who, What and When

Determining whom to contact for permission may be confusing since identifying who owns the rights to a copyrighted work is complicated. As a general rule, one is obligated to contact the rights holder, i.e., the individual or organization that owns the rights. For literary works, this is most often the publisher, but could be the author, the heirs of the author or an organization. A first step is to check the Copyright Office registration records.

Orphan Works

In the case of an orphan work, the copyright owner is either unknown or cannot be located. The U.S. Copyright Office’s 2006 Orphan Works study recommended that a user who follows best practices to identify and locate the copyright holder, but fails, should not be liable for damages if he or she then proceeds to use the work without permission.

Proof and Attribution

Attribution means including a statement that credits the author with writing the work. Failure to attribute may result in a plagiarism charge, plus it is not good business practice for a publisher to fail to credit an author for his/her work. Attribution does not substitute for permission to reproduce, however, and could even result in an infringement action.

Implied Licenses

Courts rely on the custom and practice of the community involved to determine the scope of an implied license. It is risky to rely on an implied license in a commercial setting rather than seeking permission.

GRANTING PERMISSION

Publishers also find themselves on the other side of the table when authors, other publishers, librarians, and teachers approach them for permission to use portions of the works they have published. As a matter of business planning, publishers should develop clear policies on which types of permission they will grant and which they will deny.

In-house Permissions Systems

To manage permissions in-house, publishers must dedicate staff time and effort to record the permissions granted, handle royalties received, disburse those royalties to authors and others, and ensure that similar requests are handled consistently. Computer systems can assist with these tasks, but a well-defined system is still required.

Use of a Royalty Collection Agency

Many publishers rely on the Copyright Clearance Center or iCopyright to manage copyright permissions for them. Both organizations have services for publishers that are discussed in Chapter 6.
 

Pocket Copyright Guide for Publishers, written by Laura N. Gasaway and edited by Iris Hanney, contains information vital to the publishing community.

Learn more about how copyright law affects your work or order it now.


Your Voice Matters

Reviewing the last 111 years, it is easy to check off how technology has reduced physical labor:

•    Cars replaced walking and horses
•    Planes went from dirigibles to propeller driven to jet engines
•    Automatic transmissions replaced manual stick shifts
•    Digital photography replaced film
•    Remote control boxes replaced television rotary dials

Use of voice as an input device has also evolved from the early days. Speech-to-text programs have been around since the 1980s; Covox launched voice-to-text software in 1982 for the growing personal computer industry, with the IBM PC in the lead. Another company founded in 1982, Dragon Systems, continues to be a leader in speech recognition. Its well-known product, Dragon NaturallySpeaking, is now owned and manufactured by Nuance Communications (formerly ScanSoft, Inc.).

Voice recognition is not just for getting documents created. For example:

•    Cordless and cell phones introduced us to voice activated dialing
•    GPS mapping and directions equipment allow for voice commands
•    Cars have a growing number of voice activated requests
•    Voice-enabled appliances are emerging throughout the home and office

An article about the voice control evolution appeared in the December 7, 2011 issue of BusinessWeek. It points to the rumors about an Apple television and Microsoft’s Xbox 360 game console, and to the growing number of electronics vendors (Samsung, LG, Sharp, Sony, and others) gearing up to move from button and touchpad controls to voice command and control.

One of the current salvos is Apple’s inclusion of Siri, an interactive voice assistant, with its new iPhone 4S. Asking about the weather, the stock market, or directions is standard stuff, but any action that can be performed by touch is a potential Siri command. You can say “send a text message,” then say the recipient’s name from the contact list, confirm which phone number, dictate the message, and send it.

This is pretty basic stuff. By 2013, voice commands will be everywhere. Speaking distinctly helps with today’s voice input, and someone born and raised in Alabama speaks very differently from someone raised in Maine, so advances in the technology will need to handle differences in accent, tone, and inflection. After all, we can discern when someone is speaking with a “happy voice” or an “angry voice.” There is at least one project underway that will detect a person’s mood by verbal cues.

Today’s Siri and Xbox voice control are growing in use. The expectations are that Apple’s TV set will have voice command; new Windows operating systems for PCs and Xboxes will have gesture and voice control; and Google will implement voice-activated search beyond what is accessible now. It is also clear that Google TV will return.

The consumer electronics companies will promote interactive TV talk through voice-enabled apps for smartphones and tablets. Xfinity/Comcast already has a downloadable app that provides for customer programming of the DVR. At the TV, remote control functions can be issued through the smartphone’s internet connection. Comcast is testing the addition of voice-control features. LG, Sony, Panasonic, Toshiba, Samsung and Sharp will all test similar apps.

Each family member will be able to set their own voice commands to program show recordings, change channels, and access the web. Of course, there could be a battle of the voice controls that will have to be managed by some responsible person, such as an adult. Those who are push-button phobic will have some speaking issues as they learn how to say which version of CSI they actually want to record. As with all technology advances, it can be anticipated that the transition will be easier for some and harder for others.

Nuance, maker of the popular Dragon dictation software suites, also supplies the speech recognition behind Apple’s Siri. It appears that many manufacturers have turned to Nuance to help transform remote controls rather than eschew them. Nuance’s Thompson says TV, DVD, and set-top box makers are all working on models that look more like iPhones, some with touchscreens rather than a gaggle of unused buttons. Some of the prototypes are designed around a single prominent button that activates a microphone, he says. Cost will be a challenge, since such a device would need a microphone and a Wi-Fi antenna instead of the infrared sensors now commonly used.

Nuance has estimated that 5% of TVs could be voice controlled by Christmas 2012. Of course, there are several problems to solve, such as which command takes precedence and how a TV would distinguish commands from normal conversation. But there’s hope. SRI International, which worked on Siri before spinning it off as a separate company, has been working on solutions, including a project that can discern people’s moods by verbal cues, something that may potentially be used to differentiate commands.

Mike Thompson of Nuance Communications adds that interactive remote controls will have touchscreens rather than buttons. This is similar to the Logitech Harmony series of remotes, where several different screens change the action of the button that is pressed.

Vlingo, an app maker, introduced voice apps for smartphones late last year and is expected to announce a voice recognition product for TVs at CES 2012.

We have all marveled at Dick Tracy’s wrist radio. The TV series Knight Rider was all about a car that could act better than its human star. Robots have been demonstrated that respond to voice commands and conversation. Siri on the iPhone 4S is a real-world demonstration of voice interaction. The key is for human and machine to hear the same thing. If you tell your automobile’s GPS mapping device that you want to go to Las Vegas, be sure that the directions take you to Nevada rather than New Mexico.

People are getting used to seeing others walking around talking to the air around them. These are people with a Bluetooth headset synced to a smartphone that is connected to a cellular tower sending signals out into cyberspace. And it is not just messages and conversations. Voice will be used to open the garage door, turn on the house lights, and start the oven warming to 400 degrees. It will be a novelty for upscale users at first; prices will drop quickly, and more devices will be in use by the end of this decade.

Video cameras are spreading along every city street and intersection. As voice technology advances, we will have embedded microphones in our houses, offices, cars, and all ‘smart’ devices. Devices will be listening, ready to jump into action, just as when Captain Kirk issued commands starting with the phrase “Computer …”


How Intelligent is Your Content?

An interview with Ann Rockley

Written by Richard Oppenheim for
Unlimited Priorities and DCLnews Blog

Intelligence has an ever-growing list of definitions. There is natural, artificial, and computer intelligence, along with many uses of “intelligent” as a descriptor: an intelligent question, comment, reply, and so on. With the silos of data overflowing and new silo construction happening every day, the evolution of your data into functional content is a key component of intelligent analysis and results.

People and animals have collected, stored, and preserved items throughout history. On the people side, scrolls, books, artwork, and all things collectible were brought to a central location for protection, for hoarding, or just to allow others to view them. Storage facilities were constructed long before the invention of electricity and the advent of digital data. Today, content flows as more and more things are converted to digital form. (There is no indication a pizza will evolve into something digital: you can order, pay for, and request delivery of a pizza, but eating it is a different experience.) Digitized content can be made accessible for anyone to view, whether it is a book, a movie, a museum masterpiece, or do-it-yourself images and journals.

Transforming content into intelligent content takes more than a magic wand and a few wishes. The content needs to be accessible. Once accessed, the enterprise must construct a capability to assemble various forms of content into usable information.

To shed a bright light on how content can be stored intelligently, I interviewed Ann Rockley, Founder and President of The Rockley Group. For more than 20 years, Rockley has been helping organizations and publishers of all sizes make a well-planned move to useful and usable content publishing strategies through the use of tagging schemes such as XML. The flood of content is exploding from every direction, and its volume is advancing at a steady, ever-increasing speed. This growth places strenuous demands on every enterprise, whether for-profit, not-for-profit, or government agency, to create, manage, and distribute all forms of content. Ann Rockley states, “We can do so much more than just full-text searching. We’ve gone from documents which are ‘black boxes’ to content which is structurally rich and semantically aware, and is therefore automatically discoverable, reusable, reconfigurable and adaptable.”

Intelligent content is not just about bigger, faster computer processing. In the last century, we worried about how to store all the paper that was being created. Large companies would buy or build large warehouses with cabinets and shelves to hold the documents that were being created non-stop. Cries of “paperless office” echoed from Wall Street to Main Street.

The computer did help, by creating even more paper to be stored. Accessibility to content is continuously expanding, whether through public search engines or private company search applications. Today, companies of every size need to determine how they will store and access content. The right strategy is not about technology alone; it is about “…defining a content experience for your customer that enables them to achieve their goals anywhere, anytime and on any device,” says Ann.

The first step in this process is to understand overall company requirements. The practical issues include figuring out how to integrate the significant volume of already stored data with the new data flowing through the input pipeline every second. Digital data needs to be stored with appropriate identification so that it can be accessed. With estimates of data creation now measured in zettabytes, each organization will contribute its share to this volume. The good news about evolving technologies is that huge storage facilities are being strategically located around the world with sufficient power, cooling, and security. Each content area is linked with one or more others so that overflow, malfunction, and other operating requirements can be shifted among the silos as needed.

The business demand for storage is not just about volume measured in gigabytes, terabytes, or some other huge number. The key with today’s digital data is that volume requirements fluctuate between peaks and valleys, so if a flood of new data knocks on the warehouse door, more storage space can be provided. This is called scalability. Retail stores experience this flood during the end-of-year holidays; accountants experience it during tax season. Ski resorts, Sunbelt states, and summer vacation spots all see similarly variable data flows.

Ann Rockley advises everyone to recognize just how important it is for each company to build a detailed content strategy. Whether the company is growing or holding steady, tagging, storage, security, and retrieval of content are crucial. She states, “With today’s web based access technologies, computer use is becoming easy and in many cases, even easy to use. With more people gaining access to content, there are many more opportunities for collaboration throughout the personal or business communities.” As long as the computer platforms are constructed correctly, content can expand to whatever level of intelligence is needed at the moment.

Having a structure for the content does not imply that every bit of data has the same format or application process. There is accounting data, reports, correspondence, manufacturing process control, inventory management, and on and on. Data arrives and can be reshaped, recolored, and tagged with appropriate XML-style coding to create intelligent content. Developing a content strategy starts with knowing and/or learning a few things:

  • What data is currently being collected, and where is it being used?
  • Who is accessing the content: customers, employees, researchers, etc.?
  • How can existing content be merged with new structures being created?
  • What is needed to enable scalability of content storage areas?
  • As data is collected, does the process capture the frequency and purpose of individual use?
  • Does the content flow through the company’s work processes in a logical series of steps?
  • How will the company establish and maintain appropriate taxonomy definitions?
  • How will the company manage the stored content and its accessibility?
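
As a small illustration of the XML-style tagging discussed in this interview, here is a minimal Python sketch. The tag names (`component`, `title`, `body`) and the `audience` attribute are hypothetical examples, not drawn from any particular schema such as DITA or a house taxonomy:

```python
import xml.etree.ElementTree as ET

def tag_component(title, audience, body):
    """Wrap raw content in semantic XML so it can later be discovered,
    reused, and reconfigured. The tag vocabulary here is illustrative only."""
    component = ET.Element("component", attrib={"audience": audience})
    ET.SubElement(component, "title").text = title
    ET.SubElement(component, "body").text = body
    return ET.tostring(component, encoding="unicode")

xml_fragment = tag_component(
    "Warranty terms", "customers",
    "Coverage lasts 12 months from the date of purchase.")
print(xml_fragment)
```

A search or reuse pipeline can then select components by tag or attribute (for example, every `component` whose `audience` is `customers`) instead of full-text scanning a black-box document, which is the difference Rockley draws between documents and structurally rich content.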

Development strategies do not begin with a single ‘aha’ moment. Strategy takes resources, review, input from multiple sources, and a structured blueprint for the years ahead. The strategy must be flexible so that it can adapt to potential changes in business operations going forward. Redoing the strategy every year is not just expensive; it is likely to be confusing, extremely difficult to complete within 12 months, and apt to undo any gains in intelligence through the errors that constant change introduces.

The intelligent content structure has to support the capability for individual data components to be tagged so that data can be transformed to content then transformed to information. In addition, systems and procedures have to be implemented that prevent damage from such events as simultaneous updates to individual records.

There is so much more that intelligent content will provide to the organization. There will be faster response time to content questions, improved use of resources, and an increased satisfaction for all current users and the expanding base of future users. In early search days, we used the phrase data mining to locate and retrieve nuggets of data. Mining has matured and companies can now do content mining that provides a lot more nuggets along with the information that can be determined by viewing all of the collected nuggets as a whole.

As Ann Rockley says:

"If we have a structure in our content we can manipulate it. … if it is structurally rich we can perform searches or narrow our search to the particular type of information we are interested in." The focus of intelligent content is to help us improve decision making, perform better, and work with more intelligence.


Discovery and Monetization: Two Important Challenges Facing eBook Publishers

Written by Dan Tonkery for
Unlimited Priorities and DCLnews Blog

The publishing industry’s shift from print to e-formats has been rapid, and eBooks are the fastest growing segment of the industry. In 2010 eBooks represented over 8% of trade book sales in the United States. The growing number of devices further fuels the growth rate: by the end of 2011 there will be an estimated 21 million eReaders installed, with double-digit growth in eReaders expected in 2012. Sales are forecast to exceed $1 billion.

The growth of eBooks is welcome news to a publishing industry that has been experiencing declining book sales and shrinking shelf space as more brick-and-mortar stores close. The Big Six publishers are taking full advantage of the eBook opportunity, as is Amazon with a range of eBook services.

Given that most publishers are not technology-driven organizations, they will have to turn to technology-based companies for their digital publishing solutions. Publishers are looking for companies that offer a full range of editorial, composition, and conversion services. In the eBook support area, publishers want high-quality, cost-effective delivery to eReader devices, smartphones, and tablets. The technology company not only has to help prepare the content for the appropriate eReader format; it must also assist with distribution to Apple’s iBookstore, Barnes & Noble, Kobo, Sony, and others.

Publishers are rushing to convert their current publications to eBooks, and there is also a resurgence of interest in backlists. Several major publishers in the romance field are finding new sales life in their backlists, many of which have been out of print for years. So for companies offering digital publishing solutions, there is significant work available in converting backlist or other legacy content into flexible digital formats for republishing as eBooks.

The silver lining in the rapidly growing eBook market is the opportunity for digital publishing companies to support the publishing community by converting books, journals, catalogs, newspapers, microform, and newsletters to e-content. The entire publishing community, from trade publishers and university presses to associations, government organizations, and even the STM presses, needs a level of technical support during this exciting time. The traditional players in this market need to promote their services and be aggressive in selling them. As with any new opportunity, the marketplace will see a new host of players offering services, and such is the case with eBook services. The traditional technology partner companies in this marketplace need to ensure that they bring their A game, as the new companies entering the publishing market are often venture-backed and run by seasoned, Silicon Valley-trained entrepreneurs. Supporting eBook projects is becoming a very competitive market.

Given a steady supply of good content, a stable and growing eReader market, and rapidly expanding sales, what is missing in the eBook world? What was worrying publishers at the recent Book Expo in New York City? One theme I heard over and over was “discovery.” Publishers are worried that consumers will not be able to find newly published books, as the traditional marketing and sales channels are not as useful in the digital age. While the Big Six publishers still run large marketing campaigns to promote a few of their best sellers, many other publishers, as well as self-publishing authors, are left without an easy solution.

To solve this problem there is an effort underway to develop the next best book discovery tool, one that allows publishers to suggest unfamiliar content to consumers. There are over 20 start-up companies, including the Amazon-funded BookTour, developing software solutions and tools to help authors with book promotion. Three book publishers, Hachette, Penguin, and Simon & Schuster, have teamed up to start Bookish, an editorially independent reading platform to help with discovery and sales.

The New York Times recently added an eBook best-seller list, sales of eBooks are predicted to hit $1 billion this year, and there is a race to build a better discovery tool. Discovereads (recently purchased by Goodreads) and What Should I Read Next? are relying on engineering; other services, such as BookGlutton and Copia, are creating a social experience.

aNobii is another new electronic book discovery and retail service, owned by publishers and a retailer. The service is a socially driven retail platform that aims to give publishers more consumer data than digital sellers provide. The one thing that Amazon and Apple don’t do is help you decide what to read next, although I would have to say that Amazon does a reasonable job of showing you what other people bought who purchased the same book you did. So there is an algorithm that compares purchases and surfaces other suitable candidates for the consumer to consider. Amazon also offers book reviews, which are helpful even if you don’t know the reviewer.

According to Kevin Smokler, co-founder of BookTour, readers buy books for four reasons: familiarity with the author, interest in the subject, a recommendation from a trusted source, or hearing about it in the media. Just look at the influence of Oprah and her Book Club: a recommendation of any book on her show sent sales into the millions.

Interest in a subject or familiarity with an author certainly can help sell books on the web. For example, Amanda Hocking, a 26-year-old paranormal romance author, is now selling 10,000 books a day and has just signed a four-book deal with St. Martin’s Press. Less than two years ago she was a totally unknown self-published author who built a fan base in the genre.

The publishing industry is looking for the next app: a sophisticated tool that knows a reader’s tastes and can recommend the next book, much as Pandora does with music.

Digital publishing technology companies can aid discovery by engineering and providing book tagging data for search engines to find and utilize. Already most companies are working to capture metadata including titles, chapter titles, authors, editors, volume, issues, page numbers, abstracts, and keywords. Capturing full and complete metadata is an important step in facilitating discovery.
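
As a sketch of the metadata capture described above, the record below uses hypothetical field names; a production system would follow a formal standard such as ONIX for Books or Dublin Core rather than this ad hoc shape:

```python
# Illustrative eBook metadata record; field names are hypothetical.
ebook_metadata = {
    "title": "Example Title",
    "authors": ["A. Author"],
    "chapter_titles": ["Introduction", "Methods"],
    "keywords": ["publishing", "discovery"],
    "abstract": "A short summary that search engines can index.",
}

def is_discoverable(record):
    """Full and complete metadata is what makes a title findable:
    require the core fields before pushing a record to search engines."""
    required = ("title", "authors", "keywords", "abstract")
    return all(record.get(field) for field in required)

print(is_discoverable(ebook_metadata))  # prints True
```

A check like this, run before records are distributed, is one simple way a conversion vendor could enforce the "full and complete metadata" that discovery depends on.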

The other challenge facing publishers and authors is finding the proper monetizing strategy. There are continuing debates and corporate fights over setting the price of eBooks. From the beginning, Amazon demanded and received a $9.99 price on all eBooks and forced publishers to accept a much lower list price for their eBooks than for print. This pricing strategy went a long way toward seeding the eBook market, and it set consumers’ expectations of low eBook pricing. Then along came Apple with its iBookstore, setting the publishing market on its ear with an agency model that, in the end, all the Big Six publishers have endorsed. So eBook prices have increased. Under the agency model the publisher sets the price and Apple takes its 30% cut. Publishers have been more realistic in setting their eBook prices, and Amazon has followed suit; eBooks are now priced higher, and while publishers are happier with the arrangement, consumers are still smarting over the pricing models.
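
The agency split described above is simple arithmetic; the sketch below uses an illustrative $12.99 list price with the 30% retailer share the article attributes to Apple:

```python
def agency_split(list_price, retailer_cut=0.30):
    """Under the agency model the publisher sets the consumer price
    and the retailer keeps a fixed percentage (30% in Apple's case)."""
    retailer = round(list_price * retailer_cut, 2)
    publisher = round(list_price - retailer, 2)
    return publisher, retailer

publisher_share, retailer_share = agency_split(12.99)
print(f"{publisher_share:.2f} {retailer_share:.2f}")  # prints 9.09 3.90
```

So a publisher-set $12.99 price yields about $9.09 per copy to the publisher, which helps explain why the Big Six preferred the agency model to Amazon's original $9.99 flat retail price.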

Self-publishing authors now have a number of viable choices. One of the more popular sites is Smashwords, which works with authors to produce eBooks and handles much of the background work while the author keeps about 80% of the retail price. Many self-publishing authors sell their eBooks at a much lower price than the traditional publishers; Smashwords titles are often sold at $2.99 or less. Amanda Hocking’s 10 best-selling titles are all priced at $2.99 or less, many at $0.99. It is amazing that even at those prices she was able to sell over $2 million worth of books on Amazon.

What is clear is that we are in the early days of finding a proper monetizing strategy for trade books. The pressure is on publishers to price eBooks at a level that consumers will support. There is significant revenue in the eBook sales channel, but much of the traditional cost structure will have to be reduced. The Big Six publishers, with their high overheads, are facing a tough time ahead.

The STM and professional community is not under as much price pressure as trade books but it is clear that a different monetizing strategy is also needed here. The traditional monograph book is no longer only going to be sold as a unit. Publishers need to work out a pricing strategy that will enable a user to buy the full book, a chapter, a paragraph, a chart, a photograph or any other unit that can be supported. The same technical support is required for a journal or magazine subscription. There is an important role for the digital publishing solutions companies to fulfill.

The technology needs to be in place to support the acquisition of an issue, article, table of contents, or subunit of the journal. An annual subscription to the journal may not continue to be a unit of choice for libraries. I expect the user community to begin buying information in subunits and the technology needs to be in place to support it. Already the major hosting services allow organizations to buy information by the download via a token or some other mechanism. I expect that this trend will be expanded. We need the systems in place to allow users to acquire information in any number of units or subunits.

Publishing is a changing industry and the industry will need the best technical support that they can afford. Companies offering digital publishing solutions are well placed to assist with the changes and take a leadership role in supporting the new user world.

About the Author

Dan Tonkery is president of Content Strategies as well as a contributor to Unlimited Priorities. He has served as founder and president of a number of library services companies and has spent nearly forty years building information products.


Conference Buzz: Special Libraries Association (SLA) 2011

Written for Unlimited Priorities and DCLnews Blog.

The Special Libraries Association (SLA) held its 2011 Conference on June 12-15 in Philadelphia, PA. The conference theme was “Future Ready”, with an emphasis on the need for information professionals to be ready for the future in the midst of all the changes that are buffeting the industry now.

The opening keynote speaker was Thomas Friedman, columnist for the New York Times, winner of three Pulitzer Prizes, and author of The World is Flat (Farrar, Straus & Giroux, 2005). According to Friedman, the major challenge to America today is the merger of globalization and information technology. The global economic playing field is being leveled, and America is not ready for it. He identified four forces that created this situation:

  1. The rise of the PC, allowing individuals to create their own digital content.
  2. August 9, 1995, the day that Netscape went public. Their browser brought the Internet to life and allowed everyone to interact with it.
  3. Transmission protocols and networking, which made everyone’s computer interoperable and connected.
  4. The capability for everyone to upload their content and share it with the world.

As a result, we are in an unprecedented era which is built around individuals and the degree to which they can and must act globally. We are moving from a vertical to a horizontal environment where value will be created by what people create and who they collaborate with.

These major trends mark the present time:

  • Whatever can be done will be done. Will it be done by you or to you? If you don’t pursue your ideas, someone else will.
  • The single most important competitive advantage you can have is between you and your own imagination.
  • The world is getting flatter and flatter and more and more hyper-productive. CEOs used the recession to become very efficient. The jobs they eliminated are gone and are not coming back. Whatever you do, don’t be average.
  • We are rapidly heading to a world of universal connectivity in which trust, values, ethics, and judgment will matter more than ever, and which will be hugely important to the librarian and information industry.

Opening the Monday sessions, Steve Abram, well-known speaker, former president of SLA, and now with Gale Cengage Learning, delivered a strong challenge to librarians. He said that we have a big opportunity to become the MBAs and CPAs of the next generation economy. We are in the midst of changes that are bigger than the financial or industrial revolutions. Copyright is a major issue because copyright laws will govern how the next economy will work. We must align with what we know now instead of with our old prejudices.

We are at a critical juncture where control is beginning to depend on the device, and our role is moving into a world of sense-making. … There will soon be 150 million books online. Are you ready for that?

The emphasis is no longer on the technology; it is on our role within the technology. Librarians make sense of information, and doing so improves the quality of questions; we know that libraries are for learning, discovery, and making progress. What will the end user be like after an experience with our products? Librarians have a vital role in building the critical connections between information, knowledge, and learning, and they must be biased toward quality. As technology advances, emboldened librarians hold the key.

A session on misinformation on the Internet drew an overflow crowd. Anne Mintz, author of Web of Deception (CyberAge Books, 2002), said that intentional misinformation has grown since she wrote her book, and it now goes beyond individual websites. Consumer Reports has estimated that the annual damage from spyware and other forms of misinformation is now in the billions of dollars. Criminal activity on the Internet is on the rise, fueled by the widespread popularity of social media. Identity theft has also become a much larger problem than ever before; more than 347 million records have been compromised in the US since 2005. The Internet has become much more dangerous than it once was; anyone using it must apply sound critical thinking.

It is no secret that mobile platforms are increasingly being used for information dissemination. Three representatives from the Smithsonian Institution reported on some of their pioneering work. The Smithsonian’s strategy envisions the use of shared tools across all its museums, followed by an infrastructure especially developed for mobile initiatives, products, and services. The first mobile app developed for the Museum of Natural History, MEanderthal, allows a user to morph their photo back in time to see what they would have looked like as a Neanderthal. The app has been downloaded 215,000 times in the last 14 months; 90% of those downloads were for the iPhone.

…corporate libraries are an extremely challenging environment in which to work, and part of the difficulty is that they have long had trouble deciding what to call themselves and what label to put to their skill sets.

Corporate libraries have long been leaders in applying new technology, and they have been prominent in SLA. However, with difficult economic times and shrinking budgets, they have been going through a period of severe turbulence. Jim Matarazzo, former Dean of the Graduate School of Library and Information Science at Simmons College, and Toby Pearlstein, recently retired from Bain & Co., pointed out that corporate libraries are an extremely challenging environment in which to work, and that part of the difficulty is that they have long had trouble deciding what to call themselves and what label to put to their skill sets. In some organizations, they are even trying to figure out whom they work for. As a result, libraries are not well known within their organizations, and when economic difficulties arise, they are among the first to feel cutbacks. It is therefore extremely important for libraries to prove their worth, and corporate librarians must continually be alert not only to what is happening at their firm but also to what is occurring in their industry that might affect them. The following questions form a predictive model for corporate libraries; if the answer to any one of them is “yes”, that should be a red flag:

  • Are decisions being made at the top without user consultation?
  • Is the number of library customers declining?
  • Is funding no longer available for external resources?
  • Is there evidence of a financial crisis in the parent organization?
  • Has evaluation of the library’s services ceased?

As a library manager, you must go out into the “real world” of your company and promote your value. Be aware of where the money comes from and which budget it falls under (i.e., capital or expense). Get to know your financial people very well and help them understand what you do. Look at every one of your services and see if your customers are satisfied with them. Are people really using them? If you stopped doing them, would anybody notice? You can be a master of your fate. Be prepared to participate in scenario planning and position yourself to drive decisions. And assume that every company (including yours) is for sale at any time.

The advice in this session was excellent, and in today’s environment it is relevant for all types of libraries, not just corporate ones.

As usual, there were many outstanding presentations at the SLA conference. Summaries of many of them, such as a stirring challenge on the biggest threat to libraries today (no, it’s not money, it’s copyright practices!), a wonderful and entertaining session sponsored by the Chemistry Division on The Science of Ice Cream, and the closing keynote by James Kane on loyalty are available on The Conference Circuit blog.

SLA 2012 will be July 15-18 in Chicago, IL.

Comments { 0 }

Conference Buzz: It’s What Counts: How Data Transforms Our World

Written for Unlimited Priorities and DCLnews Blog.

33rd Annual SSP Meeting

The Society for Scholarly Publishing (SSP) held its 33rd Annual Meeting on June 1-3 in Boston, MA. Its theme was “It’s What Counts: How Data Transforms Our World.” Last year in San Francisco, the attendance was 595; this year it was 720—a significant 21% increase. My overall impression was that the mood was considerably more optimistic than it has been in the 3 years since I began attending SSP meetings, and there were considerably fewer of the “How to Survive in These Tough Times” type of presentations. Besides the opening keynote and closing plenary sessions, the conference consisted of 4 concurrent tracks and a well-attended exhibit hall.

Plenary Sessions

The opening keynote, by Jon Orwant, Engineering Manager at Google, was outstanding. His title was “Approximating Omniscience,” and he noted that although we have an overflow of information, our abilities to find what we need have not kept pace with what is available. Scholarly publishing is a unique market because it has an excess of both supply and demand. Publishers can reduce information inefficiencies by packaging products in new and innovative forms. Google is working at this in its book-scanning project, and Orwant has visualized some of the statistics of this database in a variety of fascinating ways. He compared scholarship to a financial model, in which readers invest time in books and are slowly paid out in ideas. Books are analogous to savings bonds; journals are like mutual funds; and articles are like stocks. Orwant has also developed a “books ngram viewer” which compares phrases that have appeared in books over time; for example, one can compare “kindergarten,” “nursery school,” and “child care” and observe that “child care” has become a much more popular term recently. A version of the ngram viewer has also been developed for music. (Warning: This is a fascinating and entertaining website and can consume lots of time!) Orwant concluded his presentation with a list of experiments he would like to see conducted, including development of a low-overhead micropayment system for content, apps inside digital publications, and systems to reduce the costs of rights clearance.

In his endnote address, John Palfrey, Professor and Vice Dean for Library and Information Resources at Harvard Law School, focused on four issues affecting libraries and librarians:

  1. Changing patterns of learning. Youth and media are both now born digital, which has led to different practices in information access. Palfrey forecasts that by 2012, we will be more likely to access the web on a mobile device than a PC, and most of the media we interact with will be digital. This is already causing issues of information overload and credibility.
  2. Innovative teaching. Teachers will need to be more in the mode of creating. Many creators of today’s services like Facebook and Google were students when they began creating their services.
  3. Changing patterns of research and publishing. Open access is a major innovation in digital scholarship. Harvard has committed to open access in faculty publications and has begun to facilitate it for student works as well. It has also launched the Digital Public Library of America project.
  4. Changing roles for libraries and librarians. Even the richest schools in America (like Harvard) do not have increasing library budgets; the best we can hope for is that they will remain flat. We must think about sharing our collections differently. No great library can stand alone any more, so we must be more precise with our acquisition policies and determine what we have that no other library does, which we therefore have an obligation to collect. Likewise we need to aggressively create more content online.

Today’s problems and challenges can be turned into opportunities, especially in the areas of information creation and the empowerment of individuals. We are in a digital-plus era that is driving a profound transition in every field.

Concurrent Sessions

Two of the more interesting concurrent sessions that I attended were on startups and information overload. The startup session featured 6 new entrants into the marketplace and showed that innovation is still alive and well in our industry. It was also of interest because it illustrated some of the more pressing user problems of today.

The 6 companies were:

  • Mendeley: organizes a researcher’s downloaded PDF documents and allows annotations, data extraction and highlighting sections important to the user.
  • Scribd: turns any file into an HTML page and has become the world’s largest reading and sharing website. Text of any document can be made searchable by OCR.
  • SureChem: chemical patent search for scientists allowing searching of structures embedded in the text of patents. A database of 12 million chemical structures from 20 million patents can be integrated with an organization’s internal data for unified searching.
  • DeepDyve: partners with publishers in the sciences and humanities to create low-cost, single-day, viewing-only “rental” access to single documents. No printing or downloading is allowed.
  • Recorded Future: built the largest temporal index in the world, allowing searches on terms such as “next week” or “last month” by applying natural language processing to unstructured text.
  • Bioraft.com: aggregates publicly available regulatory compliance data on hazards to scientists, tracks what they use, and organizes it for safety.

…quiet study places in libraries are becoming a thing of the past because libraries are now becoming community spaces encouraging user interaction … we need to think of overload as a market problem, not just a user problem…

The information overload session was interesting because it analyzed this common problem, which besets virtually everyone in today’s information-abundant environment. One often hears people complaining about the overload problem, but this session offered some different views of it and suggested some steps toward a solution. Phil Davis, a postdoctoral researcher at Cornell University, described his experiences writing his Ph.D. dissertation and noted that quiet study places in libraries are becoming a thing of the past because libraries are now becoming community spaces encouraging user interaction. He said that we need to think of overload as a market problem, not just a user problem, and that journals can play an important role in signaling quality content. Oliver Goodenough, a Faculty Fellow at the Berkman Center for Internet and Society, said that publishers can help users overcome information overload by providing summaries of conclusions, data, and logic, with deeper analysis behind them. Karen Fisher Ratan, Associate Director, Strategic Development at HighWire Press, agreed, saying that we have not served users well but have left them to their own devices. Our products are not targeted at them, and there is a wall between users and publishers. She suggested that David Shenk’s book, Data Smog: Surviving the Information Glut, is a good reference on information overload. HighWire surveyed 45 researchers at Stanford University, who had these recommendations for publishers:

  • Respect the workflow.
  • Productivity is more important than novelty.
  • Produce time-saving tools and information.

Summaries of other sessions are available at The Conference Circuit blog, and speakers’ slides will be added to the SSP website.

Comments { 0 }

Conference Buzz: Re-inventing Content, Discovery, and Delivery for Today’s Academic Environment

Written for Unlimited Priorities and DCLnews Blog.

NFAIS 2011

Expectations of today’s academic information users have changed as technology has advanced and new technologies have appeared, so many information providers have re-invented their content accordingly. The processes of accessing and delivering information are considerably different than they were even a few years ago. This NFAIS symposium on May 25, 2011 in Philadelphia, PA examined some of the trends and issues that content providers have faced and the changes they have made to their products to accommodate today’s digital and multimedia technologies. The symposium had sessions on re-inventing content from traditional sources, effects of eBooks and eTextbooks on the learning process, and discovery and delivery platforms. It closed with a fascinating systems analysis look at book publishing.

Integration of Video

One of today’s major trends is the integration of video into all types of content. With the appearance of video hosting sites like YouTube, students have come to expect video content to play a prominent part in their education. In response to this demand, Alexander Street Press (ASP) modified its business strategy in order to concentrate on video-enhanced products. Stephen Rhind-Tutt, president of ASP, reported that the company has translated over 20,000 CD-ROMs into streaming media and has also developed a system to transcribe video into text and synchronize the text with the video images, allowing users to quickly and easily scan through the text and view only the portions of the video of interest to them. Other examples of video initiatives by publishers include the American Chemical Society, which developed a very successful video course, “Publishing Your Research 101,” that was viewed over 24,000 times in one week, and Pearson, a leading educational publisher, which is adding video and podcasts to its eBook products.

Re-Invention of Content From Traditional Sources

The Retrospective Index to Music Periodicals (RIPM) is one of the few content providers dealing with very old content—in this case, music periodicals from the 1800s up to about 1950. Because of the age of the source material, RIPM has several unique problems not generally faced by today’s information companies, such as the poor condition of the pages, handwritten notes on them, etc. RIPM has overcome these problems, producing a database of over 1.2 million pages that has become a major tool for teaching music. The user interface offers several advanced features, such as spelling suggestions, and even the ability to reconfigure one’s keyboard to accommodate non-Roman character sets.

Search vs. Discovery

Search, long a feature of information systems, has several well-known problems, as Bruce Kiesel, Director of Knowledge Base Management at Thomson Reuters, pointed out. It works best when you know what you are looking for, but it only retrieves documents. It cannot find answers to questions, knowledge, new information, or information spread across multiple documents. Discovery systems are making content increasingly intelligent, and they allow users to find unknown information by serendipity, create document maps, and find entities, concepts, relationships, or opinions. Semantic content enrichment can annotate knowledge, link to similar documents, and use metadata as a springboard to other documents, thus enabling information visualization and more proactive delivery. Thomson has greatly enhanced some of its databases using these techniques.

Re-inventing the Learning Experience

A new generation of electronic book products is changing the learning experience. It is no longer sufficient to simply repurpose printed books into a series of PDF documents. Pearson is using Flash technology in its eBooks, and Wiley has redesigned its WileyPlus product, organizing it by time instead of subject so that students can easily determine where they are in a course and can budget their time effectively. It also includes an “early warning system” that uses time and learning objectives to help students find their weak areas and study more effectively. M&C Life Sciences has overcome some of the well-known problems of publication delays by selling its content as 50- to 100-page eBooks that include animations and video. Because of their small size and rapid publication schedules, these eBooks can be updated quickly and easily as necessary.

What is a Book?

Eric Hellman, founder of Gluejar, closed the day with a fascinating look at the future of book publishing from a systems analysis viewpoint, examining questions such as:

  • Is the future of publishing related to paper and ink, or bits?
  • Will we be working with documents or objects (like software)?
  • What are the objects in our environment and what are the relationships between them?
  • What will users do with the objects?

Systems analysis involves objects and the actions taken on them. In the publishing world, the objects are textual data, articles, or photos, and the actions are navigation, sharing, and searching. Hellman compared a newspaper website such as the New York Times with a general news website such as CNN. The analysis shows that both sites have similar objects and actions (with the exception that CNN emphasizes videos), so they are very much alike. In contrast, single articles and videos are not as similar. An article has text, metadata, photos, and some context and can stand on its own; actions on it include searching and scanning through it. A video is usually focused on a single object with only some context; actions on it include playing, pausing, changing the volume, etc. Applying this analysis to eBooks, Hellman suggested that an eBook is more like a video than an article, although some eBooks work well as websites. He went on to assert that selling objects has many advantages; the best model is to aggregate them and sell subscriptions, because that is a good fit with existing book businesses.

More details on this useful and interesting symposium are available on The Conference Circuit blog, and presenters’ slides have been posted on the NFAIS website.

Comments { 0 }

The Changing Content Landscape in Publishing

Written for Unlimited Priorities and DCLnews Blog.

On May 23-24 many of us in the publishing industry will be attending BookExpo America 2011 at the Javits Center in New York City for another annual gathering of the who’s who of publishing. The exhibit halls will be filled with thousands of industry professionals and people who are there simply because they love books. Depending on the hall you visit and the booths you stop by, you will come away with a few different feelings about the book publishing industry. Last year I spent two full days at the exhibits, and the air was filled with the love of printed books. Publisher after publisher showed no indication that the world was changing, or if it was, no one was admitting that the traditional book is under any pressure from technology.

Electronic content is one of the most used applications; the growth rate is compounding each month.

Yet there was a small group of Digital Book exhibitors, all banded together in a small area of the Javits Center, showing the tools of the future. If this group succeeds in bringing new technology and opportunities to publishing, then the future of the printed book will certainly take on a very different look. Some are even questioning the future of the printed book!

In a very short period of time, technology companies led by Apple have flooded the market with tools to feed users eBooks. Apple has sold over 15 million iPads, and the iPad 2 sold over 500,000 units in its first weekend. Samsung has the Galaxy Tab, Motorola the Xoom, and BlackBerry the PlayBook; Toshiba has a Honeycomb-based tablet in the works; and don’t forget the mainstay eReaders from Amazon and Barnes & Noble. Consumers are all but replacing the laptop with the tablet computer, and users around the world are finding all sorts of applications for their new equipment. Electronic content is one of the most-used applications, and its growth rate is compounding each month.

How is the landscape of traditional publishing changing? The North American Big Six publishers…are all experiencing a continuing decline in sales of trade books at the brick and mortar stores.

Amazon’s Kindle has been accepted by users as the eBook reader of choice. Amazon sold over 7 million Kindles in 2010 and is on track to sell over 35 million by 2012. Amazon reports that its eBook sales are outpacing print sales 180 to 100 in hardcover and 115 to 100 in paperback. Apple has over 2,500 publishers in its iBookstore and has delivered over 100 million eBook downloads. If eBook readers were not enough to make an impact on print book sales, consider that smartphones can also be used to read content; the iPhone, with its 100 million handsets, is a major player in the e-content market as well. Other smartphone manufacturers are also enjoying strong sales growth.

With this type of infrastructure in place, it is not so surprising to find a major impact on traditional book publishing. So what is happening out in the marketplace? How is the landscape of traditional publishing changing? The North American Big Six publishers, including Random House, Penguin, HarperCollins, Simon & Schuster, Hachette Book Group, and Macmillan, are all experiencing a continuing decline in sales of trade books at brick-and-mortar stores. Everyone knows about the loss of the Borders stores and the continuing loss of shelf space. The NA Big Six had a significant advantage over everyone else in that they were able to put books on shelves: they have been distribution experts, backed by strong marketing and editing. Their power is now on the wane, as the distribution function is of less value in the eBook age.

Another important function now up in the air, since the market has become global, is the negotiation of territorial and language rights. Selling the rights to publish best sellers in other countries has long been a big part of the annual Frankfurt Book Fair. Imagine the impact on the traditional way of doing business when an eBook can be delivered around the globe with the push of a key. No inventory issues, no freight, no customs clearance, and no delay. You want the item, you buy it, and it is immediately available on your tablet.

No inventory issues, no freight, no customs clearance, and no delay. You want the item, you buy it, and it is immediately available on your tablet.

If the NA Big Six don’t look out, they will soon find a new Big Six taking over their role. Amazon, Apple, Google, Kobo, Ingram, and OverDrive could just as easily perform all or most of the functions that the NA Big Six offer. Each of these companies can deliver an eBook to a user anywhere in the world. They have eBookstores built on direct relationships with publishers, and they provide the customer service and help desks that connect users to their platforms.

Amazon, Apple, and Google have already demonstrated their power in the marketplace. These three companies have changed the publishing landscape through their impact on pricing and on setting terms and conditions. Apple, with its introduction of the agency sales model, tore down years of bookselling practice by displacing the wholesale model. Amazon exercised its strength in setting prices for eBooks. All of these changes are being watched by a growing number of companies looking to take advantage of the new opportunities in publishing.

The timing is right for companies outside of our industry to come in and shake up what has been an old boys’ club.

The timing is right for companies outside of our industry to come in and shake up what has been an old boys’ club. Venture-backed companies are betting that, using modern tools and techniques, they can have an impact on the future of book publishing. A host of high-quality self-publishing companies have sprouted up, offering a full range of publishing services. One company in particular has caught my attention: Smashwords, the brainchild of Mark Coker, a successful entrepreneur. It offers a full-service operation that can get your eBook published on any platform, and the author keeps 85% of the price instead of the standard 25% royalty. Other self-publishing companies include Scribd, Author Solutions, and Amazon’s CreateSpace and Kindle Direct Publishing groups, which offer potential authors a full publishing solution. These are just the first of many companies that will support authors in bypassing the traditional mainstream publishers.

While trade eBooks are perhaps the best-selling segment of the marketplace, it is interesting to note the changing landscape in textbook publishing, which is on its way to an extreme makeover. Some industry experts are predicting that within 5-7 years the digital textbook will reach its tipping point and eTextbooks will become the dominant format. The popularity of the iPad and other tablets, as well as the adoption of EPUB3, OER, and open textbooks, will help drive this shift. Unlike trade books, where the traditional publishers are losing market share, it is the major textbook publishers such as McGraw-Hill, Pearson, Wiley, and others that are driving this market. The textbook companies have invested in new companies such as Inkling to support their business objectives. Another group supported by the publishers is CourseSmart.

For years, traditional textbook publishers have lost revenue to the sale of used textbooks. Printed textbooks have also been difficult to update, as the process of editing and reprinting anything less than five years old has been expensive. Electronic textbooks, by contrast, can be updated every year with new data. Many students hope that eTextbooks will be priced lower and will offer a range of new features such as online editing, cut and paste, and note-taking support. Publishers will be able to produce eTextbooks at a lower unit cost and to tailor-make editions for different markets.

No discussion of the changing landscape in publishing is complete without mention of the opportunity to create truly new works. Publishers are going to begin creating highly accessible interactive content. There is a wide range of devices, such as the iPad and the smartphone, on which enhanced content can be viewed and consumed. Software companies are needed to support the publishing industry in the conversion and creation of multimedia e-products. Books, journals, newspapers, and magazines are all fertile ground for technology companies to assist publishers in producing their electronic products. There is an active and growing market for companies to assist in building new products or converting old content to the appropriate format. Make sure your technical staff is ready for HTML5 and EPUB3.

Nearly every publisher that I visit or work with faces the same predicament. Most have plenty of content sitting in large backlists, an active publishing program they want to convert to eBooks, or new ideas for interactive products. What I don’t see is a matching level of technical expertise within the publishing houses. Every publisher, from Random House to the New York Times to the university presses, needs help in achieving its digital potential. The venture community has recognized this skills gap, and we are seeing many start-up companies entering the market to support this critical need.

If your company offers technical support to the publishing community, be sure that you are offering the latest and greatest software solutions. Be on the lookout for new startups hitting the market, such as Vook, a new media company working on video integration with eBooks. The publishing industry is transitioning rapidly to the new market opportunity provided by devices that support e-reading. No one is standing still: the market, the technology, and the opportunities are all moving quickly.

See you at BEA 2011. Have a good show!

About the Author

Dan Tonkery is president of Content Strategies as well as a contributor to Unlimited Priorities. He has served as founder and president of a number of library services companies and has worked nearly forty years building information products.

Comments { 0 }

Health Information Technology (Part II) — The End User

Written for Unlimited Priorities and DCLnews Blog.

Debra Spruill

With people getting used to easy access to information, and with automation of health records being one of the lynchpins of controlling healthcare costs, you would think there would be more progress — and technologically, there is. But maintaining computerized health care records has its own set of issues, many of them non-technological. Aside from privacy, there are additional factors such as the variety of sources for a person’s information, the subjectivity of much of that information, the value of including handwritten notations, and the reluctance toward fully shared information between doctor and patient. These are all issues that Debra Spruill discusses in this wide-ranging second part of Health Information Technology.

A recap — In the first article of this two-article series, the focus was a comparison of the impact that healthcare information technology is having on the medical community and how it parallels the similar revolution for libraries that began in the 1960s. I revisited the rise of BRS, Bibliographic Retrieval Services, from its beginnings in 1968 at SUNY Albany as the Biomedical Communications Network (BCN). Then I reviewed SDC, Systems Development Corporation, and how it evolved from a government contract with the United States Office of Education to disseminate educational information (ERIC); SDC later developed ORBIT, which NLM adopted for its MEDLINE product. And in 1972, Dialog became a commercial online service with its strength in the sciences. All these developments served as the roots of what became known as the information industry and changed the library world forever.

I went on to demonstrate that the healthcare community has much in common with the library community. Both provide services to a varied base: libraries serve public, special, government, special-needs, and private organizations; healthcare providers serve groups, large regional organizations, clinics, mobile services, special-needs populations, etc. Ultimately both industries serve the needs of individuals, whether they be patients or patrons and regardless of how their needs are presented to the respective organization. A patient may walk into a physician’s office or clinic, be referred by another practitioner, arrive through an emergency admission, or enter through a clinical trial. A library may have a patron walk in, telephone, send an e-mail inquiry, locate its collection through an Internet portal or service, be referred through a 24/7 service, or work with an instructor on study aid development. It is this similarity in servicing the end-user/patron/patient that we will explore in this article.

In addition to the diversity of organizations, I explored the paradigm shift of long-standing services and time-tested methods being uprooted by new methods and technologies. I raised the concerns of professionals whose skillsets had to be modified, and sometimes augmented with new skills and tools, specifically as they related to technologies and methodologies. Education and training programs had to be overhauled to meet the new demands, and they continue to require review as new mechanisms emerge, e.g., social networking, mobile platforms, and tablets.

In closing I touched on the topic of privacy, one that proved a critical issue in the information community and is certainly a concern in the healthcare community. It is here that we pick up.

The End-User — Call It Patron or Patient

While the library serves patrons and the healthcare community serves patients, both ultimately approach their client base the same way: as the end-user. In other words, the patron or patient is ultimately who they aim to satisfy.

When considering the end-user, the library community had to recognize that the tools developed for the profession were not necessarily suited to patron use. These tools were developed for, and often by, professionals to access and record information and to answer patron inquiries. A patron would come in, call, or send a request stating what they wanted; the process was very results-oriented. The patron did not presume to know which tool or resource was best, nor did they necessarily care how the answer or solution was provided. Their interest was in getting the right answer, and getting it in a timely fashion. So if a patron wanted to learn what new materials were being developed for a given technology, they would simply ask the librarian: "I'm interested in finding out what plastics are being used with kitchen appliances." They might or might not indicate whether the appliance in question was for a professional restaurant, for marine use, or for the home; and if for the home, whether for a single-family house or a mobile home. This kind of detail would generally be drawn out during the interview process with the professional librarian. Some may remember what these interviews were like: an iterative process that the librarian and other reference professionals used to determine with specificity what the patron really wanted. The iterative process enabled the librarian to recognize which resources would best meet the inquiry's needs, whether those resources were in the collection or would need to be borrowed from another library, and how quickly the question could be answered. The method was refined over the years and became part of the reference desk tool-kit, often with specific written instructions to assist those on duty.

With the advent of early online tools it became even more necessary for the librarian to work with the patron to determine exactly what was being requested. Why? Because the new tools were not inexpensive, and achieving results demanded familiarity with the database(s) and search mechanisms. Unlike today's browser tools, one could not simply type in a series of terms to search. Boolean logic, combined with each service's unique search syntax, might require constructing separate search instructions for separate databases. In fact, it was not uncommon for the same database to offer different fields depending upon the service providing it.
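To make that contrast concrete, here is a minimal sketch of why the same request had to be phrased differently per service. The service names and field syntaxes below are invented for illustration; they stand in for the vendor-specific command languages that searchers had to learn.

```python
# Illustrative only: "service_a" and "service_b" and their field syntaxes
# are hypothetical, standing in for vendor-specific command languages.

def build_query(service: str, terms: list[str], field: str) -> str:
    """Combine search terms with Boolean AND in a given service's syntax."""
    if service == "service_a":
        # Hypothetical style: field code prefixes each term, e.g. TI=plastics
        return " AND ".join(f"{field}={t}" for t in terms)
    if service == "service_b":
        # Hypothetical style: field qualifier appended, e.g. plastics/TI
        return " AND ".join(f"{t}/{field}" for t in terms)
    raise ValueError(f"unknown service: {service}")

terms = ["plastics", "kitchen appliances"]
print(build_query("service_a", terms, "TI"))  # TI=plastics AND TI=kitchen appliances
print(build_query("service_b", terms, "TI"))  # plastics/TI AND kitchen appliances/TI
```

The same conceptual query, a title search on two terms, produces two different command strings; a searcher fluent in one service could not assume the other would accept the same syntax.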

Cost control and budget monitoring also weighed heavily. One did not frivolously use telecommunications, paper, and staff time; costs had to be justified.

Patron Access to Electronic Health Records — What Does It Mean?

So what is the parallel in the field of electronic health records? What does patient access mean and what does it imply?

It means that the professional, whether doctor, nurse practitioner, or dentist, is now being placed in the position of making information available to the patient that has never been shared previously except in verbal communications. While this category of data is identified as patient information, it has historically been anything but: it has been the healthcare provider's information about the patient, not readily available to the patient.

Patient information covers a very wide range. It could be the lab tests ordered, test results received, physician notes, consultation notes, the physician's opinions about the patient, consulting physicians' comments, and myriad other types of information. This information, while collected, has not generally been shared with the patient. If a nurse made a notation in the patient's file overnight for the physician to read in the morning, it might never be shared with the patient; it was for the physician's eyes only. Other than the health professionals, no one else may ever have seen the information recorded about the patient. Many discussions are now under way within the medical community, in the context of electronic health records, about the sharing of patient information.

Physicians may be reluctant to share all their notes and observations with a patient. There is concern that doing so could undermine a patient's confidence in their physician, and that the notes could discourage or alarm patients in certain settings. Not every patient may be able to cope with the full force of the information held in their files. What information to share, and when to share it, is at the core of the discussion.

In addition, the issue of information accuracy is paramount to the discussion of electronic health records. It is deemed to be the greatest challenge facing the medical profession in providing patient information.

Without question the major hurdle is accurately matching patient health information across the myriad sources from which it is derived. Determining that John Smith's lab tests are properly assigned to the correct John Smith will be daunting. And what of names with spelling variations, or naming conventions such as Chinese name structures, where the Western order of first name, last name is inverted? While it is recognized that accurately matching and providing health information for patients has benefits, such as improved patient care, improved patient safety, better efficiencies, improved fraud detection, and better data integrity, the provision of this information presents unparalleled challenges.ii
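As a rough illustration of why matching is hard, here is a minimal sketch of comparing patient names while allowing for spelling variation and inverted name order. The names and the 0.8 threshold are invented for illustration; this is not any committee's actual matching algorithm, which would also weigh identifiers, birth dates, and other fields.

```python
# Illustrative sketch only: real patient-matching systems weigh many more
# fields (identifiers, birth dates, addresses) than a name-similarity score.
from difflib import SequenceMatcher

def name_similarity(a: str, b: str) -> float:
    """Best similarity ratio between two names, also trying the
    inverted (family-name-first) form of the second name."""
    a, b = a.lower().strip(), b.lower().strip()
    inverted = " ".join(reversed(b.split()))
    return max(SequenceMatcher(None, a, x).ratio() for x in (b, inverted))

# A loose threshold flags candidate matches for human review rather than
# deciding automatically, since inaccuracy is a people problem too.
print(name_similarity("John Smith", "Jon Smith") > 0.8)   # True: spelling variation
print(name_similarity("Wei Zhang", "Zhang Wei") > 0.8)    # True: inverted name order
print(name_similarity("John Smith", "Mary Jones") > 0.8)  # False: different people
```

Even this toy example shows the trade-off: loosen the threshold and unrelated records merge; tighten it and legitimate variants of the same patient's name fail to match.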

The Department of Health and Human Services Office of the National Coordinator for Health Information Technology has a privacy and security policy committee focusing on these issues exclusively. The goal is to provide patient access to health information within four days. The objective was once provision of a patient summary; it is now provision of patient access "on demand." However, the Health Information Technology (HIT) Standards Committee has yet to define the standards, and what constitutes relevant information remains unclear.iii

What Other Players are in the Mix?

When libraries were challenged by this world of new technologies, several players influenced how service was provided: telecommunications providers (until 1984, AT&T was the only phone companyiv), distributors such as BRS, Dialog, SilverPlatter, etc., publishers such as Wiley, and networks such as SOLINET, PALINET, etc. All of these organizations affected how data was distributed and organized, and how users were trained.

So who are the other players in the complicated electronic healthcare world? There are the myriad components of the healthcare community — physicians, hospitals, clinics, pharmaceutical firms, federal, state and local governments, laboratories, public health agencies, EHR vendors, and patients. Each has a voice in how this new environment will shape up.

What are the challenges being dealt with? Medication reconciliation, submission of immunization data, drug formulary checks, drug and allergy checks, submission of reportable lab data and reconciliation with orders, clinical decision support, and exchange of clinical information.

How is the National Health Information Technology initiative organized?v It consists of Federal Advisory Committees that fall under two main umbrellas, Health IT Policy Committee and Health IT Standards Committee. Within these committees are various workgroups, such as clinical operations, privacy and security, implementation, vocabulary task force, meaningful use, information exchange, enrollment, governance, etc.

The committees are comprised of participants across the full spectrum of the healthcare community — physicians, business people, EHR vendors, healthcare unions, academia, legislators, public health agencies, nurses, hospitals, legal authorities, pharmaceutical companies, insurance companies, armed services, and clinics.

The Health IT Standards Committee fully recognizes that the challenges facing patient matching are critical. They acknowledge that perfection in matching patient information is not achievable, but that every effort must be made to eliminate errors and misattribution. They concede that inaccuracy is not just a technology problem; it is also a people problem. They recognize that poor-quality data can prevent accurate matching of information. There is no "one-size-fits-all" solution. As data becomes further removed from its source, the challenges increase; add multiple sources of data and the challenge multiplies even more. While the use of universal identifiers would be helpful, it does not provide the final answer either.vi

Conclusion

So where will Health Information Technology (HIT) lead us? Well, I believe the genie cannot be put back into the bottle. Health Information Technology is an advance that we as a nation, as patients, as providers, and as care-givers, need. As a mobile society we need to have our health information travel as readily as we do. As a technologically savvy society, we need to have health information be current, accurate and exchangeable. This last bastion of critical care information needs to move into the 21st century with all other content. We need to realize the cost savings promised, the improvement in healthcare foreseen, and the advances in managed patient care assured.

And finally, I recommend that we all tap into the information available through The Office of the National Coordinator for Health Information Technology. There may be a role we can all play.

References

i. Bjorner, Susanne, and Stephanie C. Ardito. “Online Before the Internet: Early Pioneers Tell Their Stories.” Searcher June 2003. www.infotoday.com/searcher/jun03/ardito_bjorner.shtml (accessed October 7, 2010)

ii. U.S. Department of Health and Human Services, The Office of the National Coordinator for Health Information Technology, Health IT Policy Committee: Recommendations to the National Coordinator for Health IT, healthit.hhs.gov/portal/server.pt/community/healthit_hhs_gov__policy_recommendations/1815.

iii. “HIT Exchange discusses EHR certification with CCHIT Chair Karen Bell, MD, MMS,” EHR Decisions, ehrdecisions.com/, March 4, 2011

iv. Wikipedia contributors, “Bell System divestiture,” Wikipedia, The Free Encyclopedia, en.wikipedia.org/w/index.php?title=Bell_System_divestiture&oldid=414141380 (accessed March 15, 2011)

v. U.S. Department of Health and Human Services, The Office of the National Coordinator for Health Information Technology, healthit.hhs.gov/portal/server.pt/community/healthit_hhs_gov__home/1204

vi. U.S. Department of Health and Human Services, The Office of the National Coordinator for Health Information Technology, Health IT Policy Committee: Recommendations to the National Coordinator for Health IT, healthit.hhs.gov/portal/server.pt/community/healthit_hhs_gov__policy_recommendations/1815.

About the Author

Debra Spruill is a consultant in the field of preservation with an emphasis on digital preservation. She was most recently Director of the OCLC Preservation Service Centers, responsible for strategy, business development, operations, and contracting for its four Centers, including on-site locations, as well as for client contracts. Ms. Spruill was recently named to the Library of Congress ALTO XML Schema Editorial Board. She is a member of the Unlimited Priorities team.
