
GENERATIVE AI: COPILOT DEPLOYMENT FIRST STEPS


Audience: Project manager, Solution designer, Information manager and Business user


In this article we look at the first steps for getting started with deploying Microsoft 365 Copilot, and why that really means looking at your content, where it is stored and who has access to it, so you can avoid unnecessary issues.


[Image: logo icons for Copilot, Microsoft 365, SharePoint and Purview]

You can't get the best out of Copilot without Microsoft 365 apps, SharePoint and Microsoft Purview.


A 'proof of value' helps reduce your risk in deployment

In previous articles we identified Microsoft 365 Copilot as a tool for scaling and accelerating personal productivity, whereas Artificial Intelligence, Machine Learning and Automation have a much broader organisational impact: innovation, streamlined processes, and large improvements in scale and quality for tasks like pattern analysis, diagnostics and trend analysis.


With an adoption programme evaluating Generative AI in the form of Microsoft 365 Copilot, we are no longer looking at a "proof of concept" as we might for application development. Microsoft's work on enabling the infrastructure and plumbing for these services to operate across your environment has been extensive (and running for years). What organisations need to do instead is a proof of value.


What do we mean? The generative AI services in Microsoft 365 Copilot are nuanced and tuned based on the application or content being leveraged. Microsoft has adapted indexing, identity, authentication, storage management and more across its core repositories: SharePoint/OneDrive, Exchange, Microsoft Teams, and Azure (Microsoft Entra).


What this means is that you are really addressing the People, Policy and Process sides of the adoption model. Most of the heavy lifting in the Technology area has been done. You are not testing the infrastructure; you are identifying whether the scope and application are enough for your specific needs.


The deployment first steps checklist

If you've made it through the torrent of spam, FUD, marketecture and just plain noise on the web to get as far as evaluating the tools, then we have to assume you've been diligent enough to think through the ethics of use in your context.


Assuming that works out ok (even if you hadn't noticed that tools like Grammarly are already embedded in your people's everyday tools), then for Microsoft 365 Copilot the following should be the foundations for deploying this toolset.


Where is your content?

Well, if you're going to use Microsoft 365 Copilot, then the first question is: is your content in SharePoint, OneDrive and Exchange?

[Image: generated illustration of a person, a filing cabinet and a kangaroo]

Why? Well, Microsoft 365 Copilot needs to leverage the built-in search indexing, Graph APIs, identity/authentication services and content management services to produce results.
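A quick way to see what that built-in index can actually surface (and therefore what Copilot has to ground on) is the Graph search endpoint. The following is a minimal Python sketch under some assumptions: you already hold a delegated Graph access token with permissions such as Files.Read.All, and the query string is just an example.

    import requests

    # Minimal sketch: query the Microsoft Search index that Copilot also relies on.
    # 'access_token' is a placeholder for a delegated Graph token you have already obtained.
    access_token = "<your-delegated-access-token>"

    payload = {
        "requests": [
            {
                "entityTypes": ["driveItem"],                  # files in SharePoint / OneDrive
                "query": {"queryString": "contract renewal"},  # example query only
                "from": 0,
                "size": 10,
            }
        ]
    }

    resp = requests.post(
        "https://graph.microsoft.com/v1.0/search/query",
        headers={"Authorization": f"Bearer {access_token}"},
        json=payload,
        timeout=30,
    )
    resp.raise_for_status()

    for response in resp.json()["value"]:
        for container in response.get("hitsContainers", []):
            for hit in container.get("hits", []):
                print(hit["resource"].get("name"), "-", hit.get("summary", ""))

If the content you care about doesn't come back from that index, Copilot can't ground on it either.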


So, you are going to have to move it:

  1. If you've got lots of documents or files on a file server, and you want to see if there's something useful in them for the future, then they need to be migrated to SharePoint - not into OneDrive (a minimal sketch of a single-file move via the Graph API follows this list).


  2. If your people still have personal documents all over their desktops, laptop folders, home drives (H:\) or similar, and not in OneDrive for Business, then you need to migrate that content too.


  3. External content sources - Microsoft has a connector framework, and there are integration services - but for a proof of value with general users it's probably not worth planning the integration investment required... Yet.
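For point 1, what a single-file move looks like via the Graph API is sketched below, purely as an illustration. The site identifier, library path and token are hypothetical placeholders, the simple PUT upload only suits small files, and a real migration would use dedicated tooling (e.g. SharePoint Migration Manager) rather than a script.

    import requests
    from pathlib import Path

    # Minimal sketch: push one file from a file-server share into a SharePoint library.
    # 'access_token', 'site_id' and the paths are placeholders for illustration only.
    access_token = "<graph-access-token>"
    site_id = "<your-sharepoint-site-id>"
    local_file = Path(r"\\fileserver\projects\proposal.docx")

    # Simple upload (small files only; larger files need an upload session).
    url = (
        f"https://graph.microsoft.com/v1.0/sites/{site_id}"
        f"/drive/root:/Migrated/{local_file.name}:/content"
    )
    resp = requests.put(
        url,
        headers={"Authorization": f"Bearer {access_token}"},
        data=local_file.read_bytes(),
        timeout=60,
    )
    resp.raise_for_status()
    print("Uploaded:", resp.json()["webUrl"])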


Who can/is accessing and using your content?

Are you using a reasonable facsimile of role-based access to your content?


  1. If you are using roles, groups and permissions to good effect in SharePoint and Microsoft Teams, then you are at a good starting point. You should still run access reports over sensitive areas.


  2. If you are not using security groups, site groups and permissions (read: permission levels), and your content is in shared Microsoft Teams / SharePoint sites, libraries or OneDrive, then you need to do an access audit and lock down the sensitive information before you begin - and do all of point 3.


  3. Not sure? Ask yourself: are you happy exposing employee or client data? No? Then your pre-step is a site audit, a user survey and a review of the SharePoint and Microsoft Teams reports (see the sketch after this list), e.g.

    1. the sharing links report

    2. the 'Everyone except external users' access report

    3. channel management reports
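If you want a feel for what an item-level access check involves, here is a minimal Python sketch that walks the top level of one site's default document library via the Graph API and flags items with broad sharing links. It assumes a Graph access token with Sites.Read.All and a hypothetical site_id; a real audit would page through every folder and lean on the built-in SharePoint reports as well.

    import requests

    # Minimal sketch: flag broadly shared items in the root of one site's default library.
    # 'access_token' and 'site_id' are placeholders; this only inspects top-level items.
    access_token = "<graph-access-token>"
    site_id = "<your-sharepoint-site-id>"
    headers = {"Authorization": f"Bearer {access_token}"}
    base = f"https://graph.microsoft.com/v1.0/sites/{site_id}/drive"

    items = requests.get(f"{base}/root/children", headers=headers, timeout=30)
    items.raise_for_status()

    for item in items.json()["value"]:
        perms = requests.get(f"{base}/items/{item['id']}/permissions",
                             headers=headers, timeout=30)
        perms.raise_for_status()
        for perm in perms.json()["value"]:
            link = perm.get("link", {})
            if link.get("scope") in ("anonymous", "organization"):
                print(f"REVIEW: '{item['name']}' has a '{link['scope']}' sharing link")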


Given that one of the top ethical considerations with Generative AI is respecting privacy and preventing sharing from inappropriate levels of access, your big topic of the day is identity and authentication.


Use a Proof of Value

With a Proof of Value you will find out very quickly whether your access and permissions are well defined, and why the fidelity and usefulness of the output varies so much when they are not. Additionally, any testing exercise must involve safe external users to help you determine just how exposed your content really is.


Basically, you are stress-testing:

  • managing the identity of your people

  • roles and access they have to your content

  • exposure to external parties

  • trustworthiness and honesty in testing feedback


You are going to uncover privacy issues. It's how you deal with them that is important.


Can you identify your content (with a degree of accuracy)?

Nobody likes being told to clean up and put stuff away, but you know what - that's what makes Copilot's generative AI work better: when it can find the content and be sure it's presenting results based on your organisation's rules. This is where curation, classification and categorisation are actually your friends in getting the best results and experience from the Microsoft 365 Copilot generative AI services.


If you've moved your content to SharePoint Online (or worse, just to people's OneDrive), then you might at least have started identifying your policies, procedures, forms etc., and probably separated out your personnel files, contracts and so on? No?


You need to think about starting with:


  1. Basic segmentation - enable Copilot use and results by being clear where you put content

  2. Basic labelling - enables filtering and sharing control by injecting labels onto key types of content (see the sketch after this list)

  3. Basic classification - allows for automation of privacy controls, access and item permissions

  4. Rules-based automation - enable the clean-up of old (or unnecessary) content from active areas
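To make point 2 concrete, here is a minimal sketch of "injecting" a label as simple library metadata via the Graph API. The site, list and item identifiers and the DocumentType column are hypothetical placeholders; in practice you would apply Purview sensitivity/retention labels and managed metadata through the admin tooling and auto-apply policies rather than item by item.

    import requests

    # Minimal sketch: stamp a 'DocumentType' metadata column on one library item.
    # All identifiers and the column name are hypothetical placeholders.
    access_token = "<graph-access-token>"
    site_id = "<your-sharepoint-site-id>"
    list_id = "<your-document-library-list-id>"
    item_id = "<the-list-item-id>"

    resp = requests.patch(
        f"https://graph.microsoft.com/v1.0/sites/{site_id}/lists/{list_id}"
        f"/items/{item_id}/fields",
        headers={"Authorization": f"Bearer {access_token}"},
        json={"DocumentType": "Policy"},   # a simple, queryable label on key content
        timeout=30,
    )
    resp.raise_for_status()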


There are a lot more things you can do to get better results, but start at #1 and work forward.


Have you bought the licenses and enabled it?

Microsoft (of course) starts you here, which is the worst place to start if you haven't done the thinking on your roadmap, maturity assessment or content preparation.


If you have your licenses, enabling is 'easy as', but you will need to look at whether Microsoft Teams is a critical part of what you are after - because getting all of that functionality requires an additional Microsoft Teams Premium license.

[Screenshot highlighting the currently most important areas for configuration]

The 'ah nuts' moment

[Screenshot of reporting on a pilot group of users]

There is always the 'ah nuts' moment with any review, when you realise the out-of-the-box experience has a glaringly big hole. For Microsoft 365 Copilot this is the reporting, which is frankly shonky.


For full deployment you will need to set aside time, resources and some hard thinking on what you want to see, because all you get from the Usage reports in the Microsoft 365 admin center is:


  • no. of licenses available to be assigned

  • no. of users with licenses

  • which app they used Copilot in


...and that's it. Nothing on feedback about successful (or abandoned) prompts, number of refinements, ratings on responses (good/bad), summaries of comments, reporting of privacy issues to admins, or feedback on appropriateness - in fact nothing to inform your training approach.
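Until the reporting improves, the practical workaround is to capture that feedback yourself during the pilot. Below is a minimal sketch of the kind of record you might collect via a form or survey; the field names are entirely illustrative, not anything Microsoft provides.

    from dataclasses import dataclass, field
    from datetime import datetime

    # Minimal sketch of a pilot feedback record covering the gaps in the admin reports.
    # Field names are illustrative only; collect these via whatever form/survey you use.
    @dataclass
    class CopilotFeedback:
        user: str                      # pilot user (or anonymised id)
        app: str                       # Word, Excel, Teams, Outlook...
        task: str                      # what they were trying to do
        prompt_successful: bool        # did the first prompt produce something usable?
        refinements: int               # how many times they had to re-prompt
        rating: int                    # 1 (poor) to 5 (great)
        privacy_concern: bool = False  # did the response surface something it shouldn't?
        comments: str = ""
        logged_at: datetime = field(default_factory=datetime.now)

    # Example entry from a pilot user
    entry = CopilotFeedback(
        user="user-017", app="Word", task="summarise meeting notes",
        prompt_successful=True, refinements=2, rating=4,
        comments="Good first draft, tone needed adjusting.",
    )
    print(entry)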


To get that kind of insight built in, you need to use Copilot Studio and build your own internal language model and associated "Copilot" (read: chat bot) interface.


I would argue that what you actually want, and where the value is, is in your own internal and curated content, and therefore Copilot Studio is where ongoing investment will be needed to increase the return from Generative AI within your organisation - but that's another article.


Best results from your proof-of-value testing

The key to success will be how you identify and run your proof of value, so you get a useful return and a safe, valuable adoption of Microsoft 365 Copilot (or any Generative AI toolset). If you've addressed the first steps in deployment above, it should lead you to:


  1. value identification - roles and processes that will benefit

  2. training identification - change management, not just content

  3. personnel development - process change

  4. content issue identification and clean-up

  5. access issue identification and clean-up

  6. acceptable uses and guidance


Using your evaluation criteria to assess whether Microsoft 365 Copilot is right for you comes down to your specific situation, but if it is, then the next step is turning this output into your key adoption tasks.


Assessment

Microsoft 365 Copilot generative AI tools were used to refine the wording of this article for publication. The content was created using the Spoke internal governance framework and standard operational models, coupled with consultant experience from the last 12 months of hands-on use of Microsoft 365 Copilot.


Topic generation, ideation for content and practical evaluation of the content presented were entirely the author's output. Generative AI minimised the effort required for wording refinement and image generation:


  1. Article text: 2 hours' work (I write a lot, it's easy)

  2. Image generation: 2 hours with Microsoft Designer and screenshots

  3. Summary quality and accuracy review: 1 experienced person, 1/2 hour.


Result: Validated, real-world content based on personal knowledge and experiences.


Observation

Generative AI is useful and helped speed up the capture of ideas and refine the output and presentation, but the intellectual heavy lifting is still down to the individual. The individual still has to start the process, curate the output, and approve the result before publishing.


Accountability for errors in content, presentation etc. always rests with the author.


Want to know what we know? Give us a call!

Looking for guidance in adopting Generative AI in a robust and useful manner? Or interested in learning how to adopt Generative AI into day-to-day office productivity, or even just learn some of the tricks-of-the-trade? Email us at hi@timewespoke.com


About the author: Jonathan Stuckey


