Blog

Text Data Management and Analysis – Zhai and Massung

Search is not applied magic, though it is certainly applied mathematics. The standard textbooks on the science and application of information retrieval date back to the period from 2008 to 2012. Over the last decade the research into information retrieval optimisation has been very substantial, even if this is not obvious to users of search applications. In addition, the boundary between ‘enterprise search’ and ‘text analytics’ has become increasingly blurred, to the benefit of all concerned. The problem with the standard textbooks is that they do little to bridge the chasm between information retrieval theory and search in practice, with very few examples of how the underlying mathematics translates into real life.

The chasm has now been bridged very successfully by Professor ChengXiang Zhai and his student Sean Massung at the University of Illinois at Urbana-Champaign in Text Data Management and Analysis, which is published by Morgan & Claypool for the Association for Computing Machinery (ACM). In effect this 500-page book is the printed version of the MOOC courses in text retrieval and text mining that were first offered in 2015. The benefit of these antecedents is the clarity of both the writing and the layout. The tagline is ‘A practical introduction to information retrieval and text mining’ and the content certainly matches the marketing. The book is divided into four parts.

  • Overview, with some of the core principles needed to understand subsequent chapters
  • Seven chapters on text data access
  • Eight chapters on text mining
  • A short section on unified text data management and analysis

It is not possible to get away without some applied mathematics, but where it is required the presentation is clear enough for readers without a grounding in probability or computational linguistics to follow the issues being presented. As the authors note, this book is much wider in scope than earlier books, covering topics such as probabilistic topic modelling and clearly showing the intersection not only between search and text mining but also with the integrated analysis of textual and non-textual data. In addition there is a companion toolkit, MeTA, which implements many of the techniques presented in the book and is integrated into the exercises at the end of each chapter. The toolkit has been widely used by students on the MOOC course, so it is clearly a robust application. The book is available in both print and e-book formats. The benefit of the e-book version is the internal linking to references and diagrams, but you will probably find the printed version easier to browse through. The book has an excellent index.

This book has been published at a time when the convergence between search and text analytics is accelerating rapidly. Don’t be put off by the exercises – the book will certainly be of value to students on computer science courses and on more advanced degrees in information retrieval. My experience suggests that many IT managers with responsibility for enterprise search have a background in computer science but never had the opportunity to get into the level of detail needed to fully understand how search and text mining applications achieve their apparent magic. This book will be of considerable benefit to them. It will also provide support to open source search developers who have the coding skills to work with Lucene, Solr and Elastic but may not have a full grasp of the underlying science of text analysis. It is certainly not the case that all search and text mining applications work the same way! Readers of this book will begin to understand that ‘search’ is actually a set of components, that each of the approaches selected by vendors (and open source developers) has benefits and challenges, and that getting the best out of any search application takes more than just playing design games with the user interface.

Martin White



Digital and Marketing Asset Management – Theresa Regli

For many years Lou Rosenfeld (Rosenfeld Media) and Tony Byrne (Real Story Group) have not only delivered high quality books and reports but have re-invented their business strategy to reflect changes in reader and subscriber requirements. These two entrepreneurs have now joined forces and the outcome is a classic example of synergy. The reports that the Real Story Group publishes are noteworthy not only for the quality of the vendor/product profiles but also for the extended introductions that provide very valuable context to the profiles. Digital and Marketing Asset Management is the first title in a new series of books from Rosenfeld Media in which those introductions have been transformed into a stand-alone book format. The Real Story Group covers 34 DAM vendors in its profiles report, so this is quite a significant and very competitive market.

Theresa Regli has established herself as an authority on the digital asset management (DAM) market and is a frequent speaker at the Henry Stewart DAM conferences. This 230-page book is an exceptional piece of writing, as I would have expected from someone who started their career as a journalist. The result is a seamless blend of how to manage DAM projects and how to select and implement DAM services. The initial three chapters set the scene, leading into a DAM maturity model based around a consideration of people, information, systems and processes. Then follow five chapters on the underlying technology of DAM applications, including a good discussion of on-prem, cloud and hybrid delivery options. In Chapter 9 the heading says it all – ‘You are not just buying a tool: strategic considerations’. The book concludes with a set of scenarios to use in comparing the technology solutions available and some reflections on DAM in the digital marketing mix.

I don’t have a copy of the RSG DAM report so I cannot tell how much change there has been in the journey from report to book, but the book certainly shows no evidence of the text being cut and pasted from the report. I was delighted to see that there is a very good index, which is essential in a book of this type that readers will want to dip into from time to time. The book is available in both print and ebook formats at $39. There is also a companion website. I loved the tagline of The Real Story About DAM Technology and Practice.

The most important attribute of this book is that it is written by an author who started in DAM consulting in 2008, and the experience and insights shine through. In addition Theresa treads the difficult line of being intelligible to marketing managers while remaining solid on the technology, and does so with great skill. I know from my own experience with Enterprise Search just how challenging that can be. I started to read this book knowing little about DAM but ended up with a very good understanding of the attributes, benefits and challenges of DAM systems. Definitely a book for the top shelf of the bookcase alongside my desk, and I am looking forward to future titles on Web Content and Experience Management and on Enterprise Social-Collaboration Technology.

Martin White



Gartner Digital Workplace Summit, London, 21-22 September

Gartner Summits and Symposia are a core element of the way in which the firm delivers advice to its clients. It is therefore not appropriate to judge the event as a ‘commercial’ conference, even though there were some external delegates (like me). The content of the presentations also includes research that is proprietary to clients, so I am not going to comment on specific presentations in this review. The Digital Workplace Summit is a re-branding and re-positioning of the long-standing Portals and Collaboration event, which I attended a few years ago. The event attracted probably around 400 delegates from across Europe, mostly senior managers in IT departments who are the primary contact points for Gartner services.

Gartner is clearly making a significant research commitment to digital workplace development. Many of the presentations used data collected from a number of large-scale surveys that Gartner undertakes, together with on-going discussions with its clients. This research provides an important underpinning for the advice given, something sadly lacking in this arena, where there is a tendency to scale up to a generic position from a single case study. There were a number of external case studies and presentations from sponsors, including one from Robert Leeson, Head of Service Design and Transition at Vodafone. The programme serves over 110,000 employees across the world, and what was immediately obvious from the presentation was that the rate of progress and the success achieved were due to the support of the CEO, the CTO and the global HR Director. This support extended to them blogging on a regular basis on Yammer. I was especially struck by the way in which Vodafone had given 250 Digital Ninja millennials the responsibility of mentoring 200 of the most senior managers in how to get the best out of digital technologies. A quite brilliant idea that could be of immediate benefit in any organisation.

The Summit was also an opportunity for Gartner to introduce its assessment framework for digital workplace maturity. This is designed for self-assessment benchmarking and as the basis for on-going discussions with Gartner consultants. Although it focuses on only a few aspects of the digital workplace, I was impressed with the balance of IT, employee and organisational issues. I did get a sense that the focus needed on employee and organisational issues was somewhat novel to the presenters, who tended to have a rather one-dimensional view of organisational culture, but I’m sure this will broaden out as the digital workplace programme develops.

There was some very good advice given on cloud-related issues. One of the presentations was about the benefits and challenges of moving search into the cloud, as either a hosted service or SaaS. Another focused on the challenges of migration to either Office 365 or Google Cloud, and highlighted the problems of negotiating a contract with Microsoft. Little mention was made of the role of intranets, though there was a Roundtable Session on Redefining Your Intranet for the Digital Workplace. I thought it would not be appropriate for me to attend! I was however surprised that there was just that single presentation on search. Of the 400 delegates only perhaps 25 were in the search presentation and no more than 50 in the Sinequa presentation on the company’s move to enhance search with analytics and machine learning. As well as the consultant presentations and the case studies there were two superb external speakers. Sahar Hashemi talked about how she started up Coffee Republic as an example of innovation, and Stefan Hyttfors was equally inspiring on how the nature of work is changing.

Overall I gained a great deal from the event. It was very helpful to have a research-based view of how digital workplace adoption is proceeding. It is clear that good progress is being made when there is a clear commitment to changing working practices from the most senior levels of an organisation. Bottom-up attempts to improve ‘productivity’ by just implementing more technology may bring short-term glory but no longer-term impact. One of the challenges is that few CIOs/CTOs are on the main Board of an organisation, so they are not in a position to sell the benefits across the Boardroom table and have to leave it to others to do so. The 2017 Digital Workplace Summit takes place in London on 25/26 September.

Martin White



1996 – the Year of the Intranet?

On 30 September I will be giving a presentation at the Intranet Now conference about the history of intranet development. The presentation will be based on the text of a chapter for the intranet handbook being published by Kristian Norling, Intranatverk, later this year. 2016 is an appropriate time to be looking at the history of intranets because for me 1996 marks the year when intranet technology really made the headlines and a significant number of books, reports and technical articles were published.

Among the many books on intranet management published in 1996 were

  • Intranet Working, George Eckel and William Steen, New Riders Publishing
  • The Corporate Intranet, Ryan Bernard, John Wiley & Sons
  • Running the Perfect Intranet, David Baker et al, Que Publishing
  • Internet et l’entreprise, Olivier Andrieu, Eyrolles, Paris
  • L’avantage Internet pour l’entreprise, Jane McConnell and David Ward-Perkins, Dunod, Paris
  • Building an Intranet, Tom Evans, net Publishing
  • How Intranets Work, Paul Gralla, Ziff-Davis Press
  • Intranets as Groupware, Mellanie Hills, John Wiley & Sons

It is interesting to note that John Wiley & Sons, one of the leading global publishers, had two intranet books on its list, both of which were probably commissioned in 1995.

However, it was Business Week that set everyone talking about intranets in early 1996. In a feature article by Amy Cortese in February 1996 the benefits of intranets were clearly set out, with a number of case studies.

“For now, most intranet Web sites are used for basic information sharing: publishing job listings, benefits information, and phone directories, for example. Some of these simple information-sharing setups already provide strategic advantage, though. Cap Gemini’s Knowledge Galaxy is a giant repository of technical information that helps the consulting firm respond more quickly to customers, for example. More sophisticated intranets are coming. They will let employees fill out electronic forms, query corporate databases, or hold virtual conferences over private Webs. Corporate information systems managers are “just now seeing [the Web] as the next step in application development and distribution,” says Greg Sherwood, National Semiconductor’s Web coordinator and chairman of the chipmaker’s World Wide Web council. For a taste of the future, check out Silicon Graphics. Using its intranet, dubbed Silicon Junction, the company today accomplishes such feats as making accessible more than two dozen corporate databases that employees can traverse by clicking on bright-blue hyperlinks. Previously, to get the same information, an employee had to submit a request to a staff of specially trained experts who then would extract the requested data from the company’s databases–a process that could take several days.”

The impact of this article was quite significant given that the readership of Business Week at the time was around 6 million. The reputation of Business Week was probably at its peak and undoubtedly many managers read the article and started to plan for an intranet future. The most notable development in 1996 was the visible commitment of Netscape, Microsoft, IBM, Oracle and Amdahl to intranet technology. Netscape, Microsoft and IBM all made public announcements of their intranet technology strategies in June 1996, with Amdahl and Oracle following on in August 1996. The Gartner Group was certainly taking the intranet seriously. In September 1996 the company published a 50-page report entitled Creating an Enterprise Internet and Intranet Policy. Although there is a heavy emphasis on security management, it is clear from the text of the report that the Gartner Group not only recognised the potential of intranets but was also pushing hard for companies to take an overall perspective on web and intranet policies. Twenty years later that remains very uncommon.

In many ways 1996 was a false dawn. The technology companies soon realised that there was little revenue for them in intranets. The technology was really not that complicated, especially once Microsoft bundled FrontPage into Office 97, followed by the arrival of HTML 4.0 at the end of 1997, starting an era of ‘build-your-own-intranet’. For another perspective on intranet history see this blog post from ChiefTech.

Martin White



Intranet products – are they right for you?

Last week intranet consultant Sam Driessen blogged about a number of intranet technology trends. I was very interested in his comments on the rate at which intranet platforms are emerging, many of them built on SharePoint. Sam Marshall has published a research report on some SharePoint products (in the process of being updated) and the Intranetizen team provide profiles of some intranet products. Over the years a number of my clients have adopted intranet products with success. In one case we got a proof-of-concept intranet up and running in Kuwait in three days. In each case the process of writing the statement of requirements and then selecting the vendor was carried out with considerable care. Last week I had a discussion with one of the many SharePoint product vendors which was, shall we say, interesting. Over lunch in London this week with James Robertson we spent some time discussing the development of the vendor market and the license basis (often per user per month) that vendors adopt. When Sam Driessen published his blog I added some comments to his excellent post. Following my discussion with James I thought it might be a good time to put some comments on my own blog.

For many organisations these intranet products make good sense, but I wonder how many implementations are a reaction to small intranet budgets (because the organisation has no commitment to information as an asset), a wish to pull back control of the intranet from IT, fright at the stories about SharePoint development costs, or a desire to have a product whose development roadmap they feel they have a chance of influencing. In themselves these are good tactical reasons, but not strategic ones.

Here are 12 of the issues that you ought to work through and be very certain you have good answers to. To many of these questions the vendor may respond that this is information they do not disclose. Ask them why. You are betting your reputation and career on this decision, especially if you are trying to get around the IT strategy and procurement rules.

  1. Does the license model make sense? Read the small print and cancellation clauses, and make absolutely sure you know what is not included in the base price.
  2. What is the product development road map for the duration of the initial contract and will you be able to suggest new functionality? How much notice will you get of changes?
  3. How much professional services support will these vendors be able to offer within the price point? If the vendor gets very busy (and the best of them will) where else will you get support from? Scaling professional services support is a cash-flow nightmare as you need to have the people in place before there is the revenue stream to support them.
  4. Will you have a named person as your link with the vendor after implementation? If so, can you meet them before the contract is signed? If not, why not? It might mean they are not yet on the payroll.
  5. What is the procedure for escalating problems that the vendor needs to fix?
  6. If you need some customised code (you will!) who will develop and test it, and who will actually own it? It may not be you. The vendor may want to offer it to other customers.
  7. If you operate in more than one country who is going to provide country-level support, such as contributor training?
  8. If not now, then at some time in the future you will want to link to other enterprise applications, for example for employee self-service. Exactly how will that be managed, and does the vendor have experience with the specific version of the software your organisation uses?
  9. A core element of an intranet is search, and all of the products I have seen (especially those based on SP2013) have poor (I’m being kind!) search implementations. Remember you cannot check search is fit for purpose until all your content is loaded. So have a get-out clause if it is not fit for purpose. Don’t assume that the search application you have bought can be extended to other (even SharePoint) repositories.
  10. Is there an active user group which is not under the direct control of the vendor? If there is, can you go along to a meeting? If you can’t, then why not? And if there is no user group then ask why.
  11. What happens to your content when you either cancel the contract or the vendor closes down or is acquired?
  12. What User Acceptance Testing will be carried out, who decides what the pass/fail criteria are, and what happens if there are too many fails? Will UAT include verification of the security model?

This is not a new approach to intranets. OrchidNet has been around for 21 years and I installed my first intranet product in 2003. So I know there are in fact more than 12 issues, but there is a limit to my generosity :-) Some of the products I have seen are very good, and I’m sure that they will evolve into widely adopted products. However the market is going to be very competitive on price as the out-of-the-box functionality is pretty much the same. Probably the most important, and difficult, question you need to have an answer for is what happens if your vendor runs out of cash.

Martin White


Autonomy gets a new owner – but what about those HP lawsuits?

In the late 1970s I was working for the New Product Management Group, a London-based innovation consultancy built around a well-established patent agency. The Chairman, Dr. Basil Bard, had been CEO of the National Research and Development Corporation (NRDC) and was an expert in software licensing agreements. Micro Focus Ltd was one of his clients, and I can remember a very enjoyable lunch with Basil, Brian Reynolds and Paul O’Grady (the founders of Micro Focus) about the future of the UK software industry. Brian and Paul sold out long ago but for some reason I’ve always followed the development of the company.

I was of course very pleased to see that Micro Focus International PLC had pulled off a deal with HPE (Hewlett Packard Enterprise) to acquire (technically a merger?) a substantial proportion of its software assets, including Autonomy. You will recall that following the acquisition of Autonomy by HP quite a number of law firms and accountants were hired by HP to try to find out how a business valued at $8bn seemed actually to be worth substantially less. Unless I have missed some of the action none of these cases has yet come to court. There was a news story on Bloomberg that the main case would be heard in 2018. In any acquisition the acquiring company always seeks an indemnity against potential legal actions, as these could have substantial financial and reputational implications.

So what is now going to happen? Will HPE withdraw the actions or proceed to the court case? If I were Micro Focus I would not be happy about the latter option, and if I were a stakeholder in HPE I’m not sure I’d be happy for the cases to be withdrawn. Another option is an out-of-court settlement. Because Micro Focus is a quoted company it will have to disclose the terms of any agreement over these legal suits to its shareholders. Intriguingly the lawyers for Micro Focus are Travers Smith LLP, who are one of the flagship clients of BAInsight, so they will know a good search application when they see one! My guess is that HPE will want to draw a line under the entire affair, and that would be in the interests not only of Micro Focus but also of Mike Lynch and his former colleagues at Autonomy. One of these days the full story will be told, but in the meantime it will be interesting to see what Micro Focus does with the Autonomy search software and its related assets.

Martin White


Unstructured text? It does not exist!

Along with ‘killer application’ I’d like to ban the description ‘unstructured text’. If a piece of text is going to have a semantic meaning then it has to be structured. The challenge for the search application is to take a sentence and parse it (i.e. reduce it) to a set of component words along with their syntactic relationships, such as subject, object and verb. How well the application responds to that challenge has a significant impact on the quality of the search. Understanding some of the consequences of parsing (derived from the Latin pars = part) is very important in understanding why certain queries result in poor relevance for the user. Different languages present different problems, which is why cross-language search is very difficult to deliver. In the case of English, synonyms and words with multiple senses are a challenge. Take as an example

“Martin was sounding off about the fact that making a sound (perhaps by sounding a bell) is a sound thing to do entering a sound after sounding its depth.”

In that single sentence the word ‘sound’ is used in six different ways. A typical English noun has two forms (singular and plural), a typical German noun has eight forms (singular and plural in four different cases), and a typical Hungarian noun has several hundred forms. Using an English parser on German text is ineffective because the parser will not have the set of rules needed to parse German compound nouns. There are a number of parsing applications for each of the main languages that use an extended ASCII character set. Moving into Chinese, Japanese and Korean (collectively referred to as CJK) is a very significant leap in complexity. To get a sense of the complexity of parsing, the FAQ for the Stanford open source parser is a good place to start, as is a special issue of Computational Linguistics. The reason for highlighting parsers is that there could be words or phrases that are important to search users that cannot be resolved with the parser supplied with the search application.
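
To make the idea of parsing a little more concrete, here is a minimal sketch using the open source spaCy library and its small English model (my choice purely for illustration, not one of the parsers mentioned above). It shows how a parser assigns a part of speech and a syntactic role to each token of the ‘sound’ sentence.

# A minimal sketch, assuming spaCy and its small English model are installed
# (pip install spacy; python -m spacy download en_core_web_sm).
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Martin was sounding off about the fact that making a sound (perhaps by "
          "sounding a bell) is a sound thing to do entering a sound after sounding its depth.")

for token in doc:
    # pos_ is the part of speech, dep_ the dependency (syntactic) label,
    # and head is the word this token attaches to in the parse tree.
    print(f"{token.text:10} {token.pos_:6} {token.dep_:10} head = {token.head.text}")

Running the sketch makes the point of the example visible: each occurrence of ‘sound’ is tagged differently depending on its role in the sentence, which is exactly the information a search application loses if its parser is not up to the job.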

Stemming and lemmatization

Another element within the text-processing stage of a search application is recognising variant spellings which are semantically the same. The most common example is the plural form of a word, so that a search for ‘cars’ will also find ‘car’. However stemming is a piece of brute-force programming and usually only applies to the end of a word. In general stemming increases recall at the expense of precision. For example the standard Porter stemmer (dating from 1980) will reduce operate, operating, operates, operation, operative, operatives and operational to ‘oper’. Proper names are also a challenge. Stemming will not distinguish between Christian and Kristian. Ideally users querying ‘Christian’ should be offered ‘Did you mean Kristian?’. Another case in which stemming might not work is with a word like ‘contractual’. Someone searching for information on how to write a specific contract might search for ‘contract’ but would not find any reference to ‘contractual’. This is where lemmatization comes in, which is a way of recognising the root (lemma) of a word, in this case ‘contract’. Probably the best concise account of stemming (and its converse and equally important ‘word expansion’) and lemmatization has been published by Idea Engineering.
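
If you want to see the difference for yourself, the short sketch below uses the open source NLTK library (again my choice for illustration only) to run the Porter stemmer and the WordNet lemmatizer over the examples above. The stemmer chops suffixes by rule, while the lemmatizer maps each word to a dictionary form.

# A minimal sketch, assuming NLTK is installed and the WordNet data has been
# downloaded (import nltk; nltk.download('wordnet')).
from nltk.stem import PorterStemmer, WordNetLemmatizer

stemmer = PorterStemmer()
lemmatizer = WordNetLemmatizer()

words = ["operate", "operating", "operates", "operation",
         "operative", "operatives", "operational", "contractual", "cars"]

for word in words:
    # The stemmer strips suffixes by rule (most of the 'operate' family
    # collapses towards 'oper'); the lemmatizer is more conservative and
    # returns a real dictionary form, such as 'car' for 'cars'.
    print(f"{word:12} stem: {stemmer.stem(word):10} lemma: {lemmatizer.lemmatize(word)}")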

Implications for search managers

The implications of stemming and lemmatization are considerable and need careful consideration by the search team. Indeed the complexities of language are one reason why there needs to be a search team, as it will take someone with a background in computational linguistics, information retrieval or information science not only to understand the potential challenges but also to come up with solutions. A good starting place is to build a test collection of content that is representative of some of the linguistic challenges presented by the organisation. Some of these are about the specialised terminology used in the organisation. Lawyers often refer to ‘matters’, not cases. No one is going to search for ‘matter’ as a verb, so the ranking of these specialised business-related terms has to be accommodated. Reviews of search results should also take into account whether a failed or poorly performing search is the result of a stemming problem.

Once you get into searching across more than one language (and that includes British English and American English!) the complexities mount up very quickly. Good search applications will recognise the language of a piece of text and initiate an appropriate set of parsing and stemming tools, but can you be confident that your application is not only doing this but doing it well enough to meet the expectations of search users? Moreover, will the application be able to cope with names where there are linguistic variants, because these will generally not be picked up by the language recognition software? Basis Technologies publishes excellent guidance notes on all aspects of the linguistic issues around search. Of course open source search solutions will enable you to choose which parser to use, but how will you decide which of the many available will be best for your organisation? Install and test is the only sensible approach.
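
As a simple illustration of the language identification step (not of how any particular search product does it internally), the sketch below uses the open source langdetect package, a port of Google’s language-detection library, to identify the language of a few sentences so that each could then be routed to the appropriate parsing and stemming chain.

# A minimal sketch, assuming the langdetect package is installed (pip install langdetect).
from langdetect import DetectorFactory, detect

DetectorFactory.seed = 0  # make the detector deterministic across runs

samples = [
    "The contract was signed by both parties last week.",
    "Der Vertrag wurde letzte Woche von beiden Parteien unterzeichnet.",
    "Le contrat a été signé par les deux parties la semaine dernière.",
]

for text in samples:
    # detect() returns an ISO 639-1 code such as 'en', 'de' or 'fr', which an
    # indexing pipeline could use to pick the right parser and stemmer.
    print(detect(text), "->", text)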

Martin White


The Inquiring Organisation – Chun Wei Choo

Although I claim to be an information scientist, in reality I am an information practitioner. Like so many intranet, search and knowledge managers I have to observe closely and then scale up in an effort to find some generic approaches to solving the very complex challenges that organisations face in managing information and knowledge. Maybe because of my original training as a chemist I am constantly looking for answers to ‘why’ certain approaches to information management seem to work within some form of information culture. Then in 2013 I came across a paper on information culture and organisational effectiveness by Professor Chun Wei Choo of the University of Toronto. In this paper he described result, rule, relationship and risk-taking cultures and their impact on organisational effectiveness, and I have used this model many times in the period since its publication. In 2015 a paper by Thasi Elaine Vick et al on information culture and its influences in knowledge creation built on Professor Choo’s model, bringing together knowledge management and information culture.

Now Professor Choo has published The Inquiring Organisation – How Organisations Acquire Knowledge and Seek Information which sets out the underlying principles of information and knowledge management from the perspective of the epistemology of organisational learning and information seeking. The book commences with a very well structured introduction which it is essential not to skip over – in effect it is a handbook to the book. In Part One the fundamental principles of organizational epistemology are presented, which provide an inclusive approach to the inter-relationship of knowledge and information that is not built on that invidious triangle of data, information and knowledge, topped out with wisdom. As is the case with the entire book there are relatively few case studies but those that are presented are analysed in considerable depth.

Part Two addresses organisational information behaviour. (I reviewed a book on this subject recently.) There have been many models of information behaviour, of which Professor Choo selects those by Carol Kuhlthau, Brenda Dervin and Tom Wilson to examine in considerable detail. I cannot emphasise enough how important I regard an understanding of information behaviours to be for the delivery of satisfactory information and knowledge management services. There is also a consideration of Robert Taylor’s work on a taxonomy of information use, an approach which I have found very useful in building use cases for intranets. In this section of the book Professor Choo builds on his 2013 paper referred to above and the later paper by Thasi Elaine Vick. He presents an integrated model of organisational information behaviour based on information needs, information seeking and information use. There is also a chapter on internet epistemology that at first did not seem to fit with the rest of the book, but several readings later I now understand why it was included.

This book is very well structured, both in the overall journey towards the final chapter on The Inquiring Organisation and in the introduction that sets the scene and the thoughtful codas at the end of each chapter that pull together lessons learned, ready for the continuation of the journey in the next chapter. There is a very well selected bibliography and a good index.

Professor Choo’s book rewards careful reading, because the evidence he presents and the insights he gives will provide you with an invaluable set of lenses with which to view aspects of information and knowledge management. In much of his writing his initial training in engineering comes through, with a very grounded approach to the analysis of the case studies and a sure understanding of how organisations work. In many respects he presents a unifying theory of information and knowledge management, and I would suggest that the KM community would do well to consider what Professor Choo has to say. After all, the root of the word ‘epistemology’ is the Greek word epistēmē, meaning “knowledge”. It may take you nine weeks to read and consider the nine chapters, but at the end I am certain you will say to yourself “Now I understand”. The benefits to both you personally and to your organisation will be significant and long lasting.

Martin White



SharePoint 2016 Search Explained – Agnes Molnar

One of the many mysteries about Microsoft SharePoint is that there have been so few books to guide search managers in how to get the best from the application. There were a few on SP2007 search and two excellent books on SP2010 by Mikael Svenson et al (Microsoft Press) and Mark Bennett et al (Wrox), but both are out of print. As far as I am aware there are no books of equivalent depth on SP2013. I have difficulty in understanding Microsoft’s search strategy. On the one hand it has some of the best information retrieval researchers in the world working within Microsoft Research, and yet if you want to search the Microsoft Research site the results interface is terrible. Some features don’t work at all. Agnes Molnar is the author of some of the SP2013 briefing papers published by BAInsight and has rightly gained an excellent reputation for her work on search implementation, especially on SharePoint 2013, so I welcome her initiative in writing SharePoint 2016 Search Explained.

Of course no sooner do organisations come to understand one release of SharePoint than Microsoft announces the next release. It’s actually worse than that because I know of a number of organisations who have a SP2013 intranet but retain the SP2010 search application because of its exceptional power and performance. With the gradual withdrawal of official support for SP2010 organisations are now looking beyond SP2013 and considering SP2016, Office 365 search and hybrid search. Writing about search is not easy because there are three potential readerships, namely developers/integrators, search managers and business managers. Agnes starts out with a general introduction to enterprise search and then provides a description of the technology behind SharePoint 2016, Delve and Office 365. The book concludes with a short section on content quality and the future direction of SharePoint search.

The transition from SP2010 to SP2013 search has caught many organisations by surprise. So many have told me that they are using FAST Search without any idea of what FAST Search was. In some respects SP2013 search is an advance on SP2010, but in other respects (such as the content processing pipeline) it is not. In particular it is now more optimised for searching SharePoint content and less so for wider enterprise search. This is why BAInsight has been so successful over the last few years, as it offers technology and development routes that are no longer offered by SP2013. What is missing from this book is a consideration of the implications of moving from SP2013 (and indeed SP2010 search on SP2013) to SP2016. Issues of cloud vs on-premise vs hybrid are for the enterprise architects to consider – search managers just want to be certain about how the user experience can be enhanced.

It was not until I started to write the second edition of Enterprise Search that I realised how much was missing (or was less than accurate!) from the first edition, published in 2012. The second edition, published in 2015, was almost a total rewrite, even if some of the chapters looked similar. In fact the second edition took me longer to write than the first. I know Agnes is planning a second edition of her book now, ready for the wider-scale release and adoption of SP2016. Her insights and experience will be invaluable. Buy the first edition in any case, as it will give you a good sense of the opportunities and challenges ahead, and watch out for the second edition in due course.

Martin White



Re-Imagining Productive Work with Office 365 – Michael Sampson

When I was working on the IMF intranet in 2001 (during 9/11!) I was given a book that was the result of an ethnographic study of how the IMF worked. Ethnography is the study of how people behave in a social setting, such as an office, and ever since that project I find myself looking around offices as I conduct interviews to try to get a sense of how work is being accomplished. The reason for this introduction is that Michael Sampson’s new book is not just a handbook for Office 365 but a handbook for a digital workplace which happens to be using (or planning to use) Office 365. This is an important distinction because even if you do not use Office 365 this book provides a specification for all the work elements that you need to support in whatever platform you are adopting.

Michael’s books always have a structure to them, and each chapter has sections on The Big Idea, Research Findings, the Office 365 Capability, Analysis and Evaluation, What Firms Are Doing, Behavioural Aspects and On Improving Performance. After the introductory chapters the topics covered are

  • Storing and Sharing Files
  • Profiling Employee Expertise
  • Co-Authoring Documents
  • Managing Meetings
  • Holding Discussions
  • Running Team Projects
  • Thinking Productively

The section on research findings is important because there are many lessons to be learned from well-conducted surveys and from academic research. Most practitioners ignore this wealth of knowledge but Michael presents it in a way that the implications for a digital workplace manager are clear and helpful. This book is not a ‘quick read’ and certainly does not set out to be a populist “101 on Office 365”. Some authors make me feel that they are talking to me; with Michael I feel that he is alongside me guiding me through the forest of digital working to show where Office 365 offers good solutions, and also where it is currently lacking in functionality. Rather like a tour guide around a new city! It has taken me a while to write this review just because I have been working through it slowly (very unusual for me) and adding digital comments to the digital text as I went along.

As with all of Michael’s books the production quality for this self-published book is at a level that the leading commercial publishers would be proud of. The book is presented in landscape format which works well when text and screen shots have to meet up. I would like to have seen the comments about where Office 365 does not deliver given a little more highlighting and perhaps a suggestion of a work-around. My own frustrations with Office 365 are the error messages and the latency. I can work more quickly than the server! There is also no reference to search in Office 365 apart from a passing reference to Delve. There is a particular issue with people search, but that’s a long story and you can read more about it in Enterprise Search.

The single user price is $19, which is less than a couple of coffees and two nice cakes! If you look at the comments from other readers you will see that most of them focus on the benefits to Office 365 users. But this is far more than the Unofficial Handbook for Office 365. If you have any plans or even pretensions of creating a digital workplace then you need this book. Jane McConnell will guide you on strategy, Michael shows you how to put the strategy into action using Office 365 as an example platform.

Above all this book will make you think about what the core working patterns are in your organisation. Without this understanding any digital platform will fail to support productive work. Sometimes you may even have to change the working patterns to get the best from the technology so that overall the organisation benefits from the investment. You will certainly benefit from investing in this book.

Martin White