Category: News

ISC Creates Cloud Navigator to Focus on Delivering Microsoft Cloud Services

Designed to take advantage of the explosion of cloud services

Mark Alexander to become President

Tallahassee, Florida, August 14, 2018.

ISC, a Microsoft Cloud Solution Provider (CSP) and Gold Competency Partner, today announced that its Board of Directors has approved a plan to create Cloud Navigator, a company brand focused on delivering and managing cloud-based Microsoft services to public sector, non-profit and enterprise customers, effective immediately.

“This step will help us leverage the exponential growth of the cloud to provide our customers with a competitive advantage and improve our ability to integrate cloud services into our customers’ day to day operations, seamlessly and intuitively,” said Mark Alexander, President of Cloud Navigator.

"We are enabling our customers to operate as a ‘Modern Enterprise’ by delivering & integrating Microsoft cloud-based services in the following practice areas:

Employee Productivity and Collaboration
Enterprise Security and Compliance
Business & Customer Application Innovation and Modernization

Infrastructure, Platform and Software as a Service (IaaS, PaaS & SaaS)

By drilling into each of these practice areas, we will be able to integrate our customers’ business goals and objectives into scalable, flexible solutions that will enhance their ability to deliver products and services more effectively to their constituent and customers”.

As a Microsoft Cloud Solution Provider for commercial and government accounts, Cloud Navigator bundles cloud onboarding and management services with cloud platform subscriptions using a pay-for-what-you-use model. To help operate the new division, Kristal Middlebrook has been promoted to Chief Operating Officer and Greg Dodge has been promoted to Chief Technology Officer. Edwin Lott will remain as the Managing Partner of ISC.
Cloud Navigator, based in Tallahassee, Florida, is a Microsoft Gold Certified Partner and delivers and manages Microsoft solutions for the public and private sectors, helping them become the “Modern Enterprise”. They have helped hundreds of clients adopt, operate and maintain cloud-based solutions that help them deliver products and services more effectively.
Avoiding disaster at home and at work


In the wake of Hurricane Irma, I’m taking some time to reflect on my personal and business disaster preparedness plans and outcomes.  Living in Florida, we learn to expect a tropical storm or hurricane to threaten our homes and businesses each year.  Most years, we dodge the worst of it, but usually there is some impact.

Here I sit in my office two days after the storm has passed over us, and the power is still out at my house a couple of miles away. I live on a street that is still serviced by overhead power lines, on a canopy road with a lot of majestic live oak trees that frequently interfere with the electricity. I have a portable generator, but it is loud and difficult to keep running for multiple days, so the food in our refrigerator and freezer is going to get tossed out again. We don’t have running water either.

There was a time not long ago that our company’s internal IT systems were prone to fail in some way, and it wasn’t just during storms. We suffered outages to email, application and data servers whenever the power went out, and sometimes it was tricky to get them back up and running. I can think back to one storm after which a database server suffered a disk drive failure when we tried to bring it back online, and we lost a couple of days getting it fixed.

Things are different today. We have no servers at either of our office buildings, just a little bit of network hardware. All of our systems are in The Cloud. We rely on the built-in redundancy and fault tolerance offered by Microsoft’s cloud platforms. These systems are designed to expect failures and outages and to remain operational and protected. Disk drive failures probably happen frequently, but we never even know about them, and there is absolutely no impact on operations.

If a data center where one of our systems is deployed were to be destroyed, we of course would have some down time, but we know that all of our data, apps and servers are safely replicated to other data centers and ready to be brought online if necessary.

The day before Hurricane Irma hit us, we put a few sandbags along the front door to the office building.  The double glass doors have leaked water once or twice during heavy wind and rain.  That was the extent of our disaster preparedness efforts as the storm approached.

Over the past year, we’ve learned how self-hosted, on-premises IT systems can be protected in The Cloud, and we’ve been rolling out that capability to our customers, so you don’t have to go “all in” on cloud computing to enjoy some of its benefits.

One of the most important things I’ve learned myself is that it doesn’t have to be difficult to have peace of mind and a high confidence that stuff is going to keep on working.  It can be easy, inexpensive, and rapidly implemented.  All good!


Azure Site Recovery – Disaster Recovery Made Easy


Experiences with Azure Site Recovery

As a provider of cloud solutions, our earliest use of Azure Site Recovery (ASR) wasn’t protection at all but rather migration. We used ASR to replicate small sets of servers from customer premises to the cloud. For customers running Hyper-V or VMware with supported server images, ASR makes migration to the cloud almost trivially easy.

Strategies for Delivering Disaster Recovery

After our success with this limited use of ASR, we were interested in more challenging engagements.  We wanted to use ASR to deliver Disaster Recovery as a Service to customers.  Our partners at Microsoft were happy to help us plan our strategy.  For example, they told us that other business partners were focusing on the onboarding phase of a typical ASR engagement.  As a result, we designed offerings where onboarding and ongoing monitoring and management are individual options. For ongoing services we opted to give our customers a choice between monitoring their recovery solution themselves or paying us a modest fee to do so.

An Example DR Engagement

This strategy proved fortuitous. One customer, Florida Surplus Lines Service Office, had a budget for the initial work but wanted to keep ongoing expenses down by doing all monitoring themselves. We bid the onboarding a bit lower than initially planned. That’s a decision I would occasionally second-guess over the next few weeks. In the end, even if our effective hourly rate was a bit less than we would have preferred, it was a valuable opportunity to learn and we delivered a solid solution.

A Successful Outcome

We did all the configuration on the Azure side, and worked closely with the customer on activities that had to be done on-premises. Critical on-premises tasks were to run the deployment planner; set up a VPN gateway between the local and Azure networks; set up a configuration server for their VMware environment; and update a critical Oracle/Linux server to a version supported by ASR.

Based on the output from the deployment planner, we divided their servers into three batches to initiate protection. Each group took a day or two to reach protected status. On the Azure side, we had two networks set up: a VPN-joined network hosting a secondary domain controller, and an isolated network for test failovers. Our very first test was to fail over a domain controller to the isolated network. We then promoted it to primary to have domain services available on that network.

Next we did a test failover of all protected infrastructure to the test network.   Test failovers do not impact protected workloads and can be used for non-disruptive DR readiness testing.  The customer confirmed that interactions between different parts of the failed-over infrastructure performed correctly.

Our final test was a true failover and failback of a test machine on their production network. Their servers communicated effectively across a VPN gateway, and the failed-over server retained changes after failback.

At this point we and our customer were satisfied that their servers were properly protected.  Although they opted to monitor the solution themselves, we keep an engineer on their alert notification list.  We review the notifications from time to time, and as Microsoft continues to improve their monitoring tools we plan to keep them updated on features and practices that may be of use to them.

Join Entities in Dynamics 365 and Why They Are Awesome


If you have ever worked with many-to-many (N:N) relationships in Dynamics 365 (the product formerly known as Dynamics CRM), you may have at some point created an N:N relationship between entities. It is a useful relationship type, for sure, but it has some serious ‘out of the box’ limitations.


The main issue I have always had with it is the complete inability to execute a workflow process when the relationship is created, and the lack of any audit record of who created or modified the relationship and when.

Example 1:

Contact has an N:N relationship between itself and a custom entity called ‘Web Roles’. You assign new Web Roles to the Contact record to allow them access to pages on a custom portal. But you need to know who added the role and when it was added. Say you have delegated web role assignment to customers with an admin role so they can manage their own users on the portal. How would you know who added the role and when?

Example 2:

Contact has an N:N relationship to Account. Each Contact has a regular parental N:1 relationship to an Account, but they might also have relationships to several other Accounts. Perhaps they are a distributor of your products: they have a company they work for, but also work with several of your other Accounts to sell them products. And each Contact may play a different role with respect to each of those other Accounts.

But Wait??

If you have worked with Dynamics 365 for any amount of time, you might be thinking “Hey, you can use the built-in Connections entity for this”. And you would be correct, but only if you plan to use Connections for just one type of N:N relationship. Since Connections allow basically ANY record to be linked to any other record, they are a lot more generic than this scenario needs.



The solution is what I like to call a ‘Join Entity’. If you have ever done traditional database or application development that worked directly with a SQL database, you should already be familiar with this concept. It’s basically a table that sits between two other tables and stores the primary keys of the records in each table that need to be joined.
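As a quick illustration of the concept, here is a minimal sketch in plain JavaScript. The table and field names are made up for this example; they stand in for the two joined entities and the join table between them:

```javascript
// Minimal sketch of a join table: accounts and contacts are the two
// "entities", and accountToContact stores one row per relationship.
const accounts = [
  { id: "A1", name: "Contoso" },
  { id: "A2", name: "Fabrikam" },
];
const contacts = [
  { id: "C1", name: "Jane Doe" },
  { id: "C2", name: "John Smith" },
];
// Each join row carries the primary key of a record on each side, plus
// any extra attributes describing the relationship itself (e.g. a role).
const accountToContact = [
  { accountId: "A1", contactId: "C1", role: "Distributor" },
  { accountId: "A1", contactId: "C2", role: "Reseller" },
  { accountId: "A2", contactId: "C1", role: "Partner" },
];

// Resolve all contact names related to a given account via the join table.
function contactsForAccount(accountId) {
  return accountToContact
    .filter((row) => row.accountId === accountId)
    .map((row) => contacts.find((c) => c.id === row.contactId).name);
}
```

Calling `contactsForAccount("A1")` walks the join rows and returns `["Jane Doe", "John Smith"]`; the same pattern in reverse resolves accounts for a contact.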



In Dynamics 365 parlance, we create a Join Entity that works just like a join table.

Step 1: Create a new Custom Entity

When you create this entity, give it a good name that reflects what you are joining. In this example, we are going to create a join between Account and Contact to allow for multiple Contacts to be associated with multiple Accounts. I’m going to call this one Account To Contact.


For the Ownership option, this one is up to you. In most scenarios it is safe to set this to Organization, since we are just using this entity to join up other entities and we don’t need the overhead associated with User or Team ownership. If the relationship calls for it and some user or team needs to own the relationship, then by all means set it that way.


Most likely you won’t need any of the Communication & Collaboration options enabled, and you can always enable most of them later anyway.  As a rule, I like to keep them all OFF until I know I need them.


For the Data Services options, I would enable Allow Quick Create and Enable Auditing.


For Primary Field, you can leave the default name as Name, or give it something else more appropriate if you prefer.  We’ll talk about how to deal with this field in a later step.


IMPORTANT: You need to set the Field Requirement for the Primary Field to Optional at this step. If you forget, don’t worry; it can be changed later by editing the field directly, but it’s best to do it now.


Side Note: For the color setting, I like to set all my custom entities’ color to plain white (#ffffff) and then get some nice flat black icons from Icons8. It’s an awesome site with thousands of icons, and I highly recommend it.


Click Save to create your new entity.



Step 2: Add the Relationship Lookups

Now we need to add the appropriate N:1 lookups for the entities we are trying to join.
Click the Fields option, and then click New in the toolbar.


Display Name should be something that makes sense to the user that will be adding this new record/relationship.  For our example, we are joining to Account so we will call it Account.


Tip: This will also default the Name field to Account, but it makes sense to add an Id suffix to this SDK name field. This will be helpful later if you have to write any JavaScript or .NET code that references the field: you will be able to quickly recognize it as a GUID lookup to another entity.
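For example, client-side script that later needs that GUID might look like the following sketch. The attribute name `new_accountid` follows this walkthrough’s naming; the `formContext` object is supplied by Dynamics 365 at runtime, so a small stub stands in for it here:

```javascript
// Pull the related record's GUID out of a Dynamics 365 lookup attribute.
// Lookup values come back as an array of { id, name, entityType } objects.
function getLookupId(formContext, attributeName) {
  const attr = formContext.getAttribute(attributeName);
  const value = attr && attr.getValue();
  if (!value || value.length === 0) {
    return null; // no related record selected
  }
  // GUIDs arrive wrapped in braces, e.g. "{1111...}"; strip them.
  return value[0].id.replace(/[{}]/g, "");
}

// A tiny stub that mimics the shape of the real formContext for testing.
const stubFormContext = {
  getAttribute: (name) => ({
    getValue: () =>
      name === "new_accountid"
        ? [{ id: "{11111111-2222-3333-4444-555555555555}", name: "Contoso", entityType: "account" }]
        : null,
  }),
};
```

With the Id suffix convention, `getLookupId(formContext, "new_accountid")` reads naturally as “give me the Account GUID”.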


Field Requirement should be set to Business Required to avoid any orphaned join records.


Select a Data Type of Lookup and a Target Record Type of Account.



Repeat these steps to create the Contact lookup:
  • Display Name = Contact
  • Name = new_ContactId
  • Field Requirement = Business Required
  • Data Type = Lookup
  • Target Record Type = Contact
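With both lookups in place, a join record can also be created programmatically through the Dynamics 365 Web API by binding each lookup with `@odata.bind`. This is a sketch under the assumption that the publisher prefix is `new_` and the schema names match this walkthrough; check your own solution for the exact navigation property and entity set names:

```javascript
// Build the JSON body for creating one Account To Contact join record
// via the Dynamics 365 Web API (e.g. POST .../api/data/v9.x/new_accounttocontacts).
function buildJoinRecord(accountGuid, contactGuid) {
  return {
    // Single-valued navigation properties (our two lookups) are set with
    // @odata.bind, pointing at the related record's entity set and GUID.
    "new_AccountId@odata.bind": `/accounts(${accountGuid})`,
    "new_ContactId@odata.bind": `/contacts(${contactGuid})`,
  };
}
```

Because both lookups are Business Required, a POST that omits either binding will be rejected, which is exactly the orphan protection we set up above.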

Step 3: Edit Default Form

Click Forms in the left navigation to review the list of built-in system forms. Click the Information form listed first, with a Form Type of Main.


Customize this form and add your newly created Account and Contact fields.  Click Save and Close to save your form changes.


Tip: Since you only get one section by default, the Name (and possibly Owner) field(s) will fill up the entire width of the form.  I find this annoying and completely impractical, but that’s just me.  I will usually edit the General Tab and set the Formatting option to use Two or Three Columns.



Step 4: Create Quick Create Form

While still looking at the list of Forms, click New on the toolbar and select Quick Create Form.


Modify the form by adding your two lookup fields. Save and Close the new form.


Optional:  Probably a good idea to go ahead and publish your changes now if you haven’t already done so.



Tip: If you know what Source entity will be used a majority of the time, put the other entity lookup first on the form.  In this case, we are assuming that contacts will be added from the Account record, so we are showing the Contact lookup field first.  The Account field will already be filled in when the join record is quick created.


Step 5: Build a workflow to set the Name Field

Now that we have the basics set up, we need to set a Name for these new records. This is the name that will show in any lookups to this entity, since the system displays the primary field by default. We’ll build some views to be used on forms, but we can’t just leave this field blank. This workflow will set the Name to a combination of the Account and Contact names.


Note: Recall that in Step 1 we set the primary field requirement to Optional.  If you did not do that, now is the time.


Navigate in your Solution (or All Customizations) to Processes and click New.
  • Process Name: This should reflect the Entity Name, RT for ‘Real-Time’, and some description.  I like to call this one New to show that it runs when a new record is created.
  • Category = Workflow
  • Entity = Account To Contact
  • Run this workflow in the background should be Unchecked/Off



Click OK to create the new process.
Options for Automatic Processes
  • Scope = Organization
  • Start When =
    • After Record is Created
    • After Record Fields Change
      • Select Account and Contact (the custom join lookup fields added earlier)
  • In the logic area, add a step to Update Record
  • Click the Set Properties button next to the Update Record.
  • Click in the Name field and add the dynamic values of Account and Contact.


Tip: Put some kind of delimiter, like a dash, asterisk, or colon, between the values.



Note:  This can be whatever dynamic values you like, but try to make it unique to this join record.  Remember that this will show up in all the lookups that may reference this join entity.

  • Save and Close the Properties
  • Activate the Workflow Process
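For illustration, the string the workflow builds is equivalent to this small hypothetical helper (the delimiter is whatever you chose in the tip above):

```javascript
// Compose the join record's Name from its two related record names,
// separated by a delimiter so the lookup text stays readable and unique.
function buildJoinName(accountName, contactName, delimiter = " - ") {
  return `${accountName}${delimiter}${contactName}`;
}
```

So an Account “Contoso” joined to a Contact “Jane Doe” gets the Name “Contoso - Jane Doe”, which is what users will see in every lookup that references the join record.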

Step 6: Customize a Form View

We are going to want to add a sub-grid to both entities we are joining to, so we need a view that shows the values.


At this point we haven’t modified any of the default views. We will modify the Active Account to Contact view, then do a Save As and adapt that copy for use as a form view.


  • Navigate in your solution (or All Customizations) to the Account to Contact entity.
  • Select Views from the left navigation tree
  • Select to edit the Active Account to Contact view
  • Add the Account and Contact lookup columns to the view, and order them as you prefer.
  • Save the view, but DO NOT CLOSE the window
  • Click Save As and enter Form View for the new view name
  • Remove the Name column from the view
  • Move the Contact name to the first column. Note: This is because we will use this view on the Account form, so it makes sense for the Contacts to be listed first.



Option: If you like, you can do another Save As, call it Contact Form View, and set the Account as the first column.


Step 7: Add it as a sub-grid to a form

Now that we have our entity, with the fields set correctly and a form view built, we can put this all together on the Account form and see how it works!


  • Navigate in your solution (or All Customizations) to the Account entity.
  • Select Forms from the left navigation tree
  • Select the Account form of Form Type Main from the view
  • Scroll to the appropriate spot on the form where you want to display the new entity
  • Insert a Section to contain the Sub-grid
  • With the Section selected, insert a Sub-Grid
The key fields here are in the Data Source area.
  • Records = Only Related Records
  • Entity = Account to Contact (Account)
  • Default View = Form View



Note:  If/when you add this to Contact, select Contact Form View instead of Form View.
Save, Publish, and Save and Close the form changes.


Step 8: Test it!

Tip: If you haven’t recently done so, now is a good time to Publish All Customizations.  We’ve made a lot of changes so we want to be sure everything shows up for testing.
  • Navigate to an Account record
  • Scroll down to where you added the sub-grid; you should see the new Account to Contact grid


To add a new Contact to relate to this account, click the + button.  You should see the Quick Create form we created earlier.



Note: If you don’t see this + button, or you get the regular entity form instead of the quick create form, you missed a step somewhere in setting up the entity and lookups. Check the following:
  • Is the Account to Contact entity set to Allow Quick Create?
  • Are the Account and Contact lookup fields set to Business Required?
Enter a Contact name, and click Save. You should now see the new relationship added!


Step 9: Extend it!

What we’ve done so far has really been pretty much what you get out of the box for N:N relationships, with the exception of step 8, where we added the new relationship with a quick create form. Adding those with the out-of-the-box functionality is painful (just my opinion…). Now we can extend this new entity to do something the out-of-the-box functionality does not.


For our example, we are going to add a Role attribute to the entity.
  • Navigate in your solution (or All Customizations) to the Account to Contact entity.
  • Select Fields from the left navigation tree
  • Click New to add a new Field
  • Display Name: Contact Role
  • Data Type: Option Set
    • Add Options for Role 1, 2, 3, and 4
Note: This is just for example purposes; you can add any field type your needs require.


Save and Close the new field



  • Select Forms from the left navigation
  • Select the Quick Create form type
  • Drag the new Contact Role field on to the form
  • Save and Close
  • Also edit the main Information form and add the Contact Role field, Save and Close
  • Select Views from the left navigation
  • Select the Form View(s)
  • Add the new Contact Role field to the view(s)
  • Save and Close
  • Publish your changes
Now when you view the data on your Account entity, you’ll see the new Role value assigned to each record.



We have covered a lot of different topics, but you should now have a good grasp of the power of using this Join Entity concept instead of the out-of-the-box many-to-many (N:N) relationship. You gain auditing, so you can see who changed what and when. You gain the ability to extend the relationship with additional fields that describe it (e.g. roles).
You can also extend the entity to run workflow processes when records are created or modified. We did a simple name update, but a workflow could perform many other tasks if needed (e.g. send an email, update a related record, etc.).



A Taxonomy of Microsoft Security Services


I was having difficulty keeping up with all of the Microsoft security-related products, services, features and nomenclature, so I started this taxonomy. What I found is that there can be multiple “product” or brand names that apply across the same technology set. It can get confusing. This listing might be helpful in certain cases. It has certainly helped me get my mind around what Microsoft has to offer.

Fortunately, it turns out that it is not so difficult to match the right set of security services to your situation and need.  We do it all the time with customers.  It may just be easier than sorting out all of these names!

  • Azure Rights Management (ARM)
    • Policies and encryption
    • Includes:
      • Information Rights Management (IRM)
        • Document, library and message policy based data loss protection
      • Office 365 Message Encryption (OME)
        • Protected sharing via email and OneDrive
  • Azure Information Protection (AIP)
    • A broader label and product packaging over ARM
  • Azure Advanced Threat Analytics (ATA)
    • On premise solution
    • Uses Azure Machine Learning to adapt
  • Azure IaaS Security
    • Network Security Groups, VPN Gateway
    • Azure Storage Service Encryption
    • Azure Disk Encryption
    • Web Application Firewalls
    • Azure Monitor
    • User Defined Routes
    • Network Watcher
    • Azure Storage Account Keys
    • The following have an impact on security:
      • Azure Traffic Manager, Application Proxy
      • Azure Storage Analytics
      • Azure Backup and ASR
      • Remote Desktop Gateway
      • Azure Dev/Test Labs
  • Azure PaaS Security
    • Azure SQL Transparent Data Encryption
    • Firewall, Connection Encryption
  • Azure Security Center
    • Monitoring of Azure resources
    • Full monitoring, threat detection, policy-based platform for security in Azure
    • Application Whitelisting
    • Just-in-Time Network Access to VMs
    • Machine Learning for brute force detection and outbound DDoS
    • Azure SQL Database Threat Detection
    • Integration with partners: Fortinet, Cisco
  • OMS
    • Log reporting and alerting
    • Can collect Azure resource logs as well as on-premises logs when connected to SCCM
    • Security & Compliance Solution
      • Security Compliance Manager
    • Update and Change Management
    • Antimalware Assessment
    • Active Directory and SQL Health Analysis
  • Azure Active Directory
    • Premium
    • B2C
    • Domain Services
    • Multi-factor Authentication (MFA)
  • Azure Key Vault
    • Hardware Security Modules
    • SIEM Export
  • Enterprise Mobility + Security (EM+S) (aka Enterprise Mobility Suite (EMS))
    • Intune for mobile device management
    • Azure Rights Management Services
    • Advanced Threat Analytics
    • Azure AD Premium
    • Remote Desktop Services
  • Office 365
    • Advanced Threat Protection
    • Security & Privacy Settings
      • Password policy
      • Customer Lockbox
      • Sharing
      • Self-service password reset
    • Security & Compliance
      • Cloud App Security (aka Advanced Security Management)
      • Threat management
      • Data Loss Prevention
      • Data Governance
      • Search & Investigation
      • Service Assurance/Compliance Reports

Using OMS and ASC for Threat Detection

Have you ever heard the phrase “Shoemaker’s kids go barefoot” or “Mechanic’s car never runs”? Well, you can add a new one: “IT consultant’s labs are insecure”. As a 25-year veteran of the IT industry, I’m very familiar with limiting access and reducing the attack vectors for internet-connected devices. I do this for customers every day and have gotten pretty good at making sure bad actors cannot break into the systems I design and set up. Like any good IT consultant, I also have access to a shared lab of servers that acts as a sandbox for testing and understanding deployment scenarios. For convenience, the consultants at our company need to be able to access that lab from anywhere, at any time, from any device. The lab does not contain anything of value or any customer data, so my thinking was to open it up for “convenience”. The lab has been running for many years with no issues; it was originally set up early on in Azure using classic (ASM) IaaS VMs.

Fast forward to May 2017, when I attended the Azure Architect Bootcamp: five days packed with more information than anyone should legally be allowed to consume. During the presentations, I was intrigued by the capabilities of OMS and the analytics it captures. I was following along with the presenter for OMS when I noticed in Service Map that there were lots of “Terminal Services” connections to one of our lab machines from numerous external IP addresses that were not from our offices. The VMs were implemented with a classic Network Security Group that was allowing any-to-any connections over port 3389 to the machines. As soon as I saw the connections in OMS Service Map, which I had deployed in my lab the day before, I suspected a port scan or some type of intruder.


The next presenter started the presentation on Azure Security Center (ASC), so I switched over to Security Center, and that is when I noticed this: NSGs missing on subnets and VMs. The highlighted machines are production machines that are locked down separately, but all the others that start with “ISC365” are lab machines in this subscription and are rarely logged into.


I immediately logged into the lab system ISC365-AP1 to view the security event logs, and lo and behold, I was actively being attacked: every few seconds, an active connection was guessing usernames and passwords. This was some sort of password-guessing bot using a database of well-known passwords, and even though we use strong passwords on the administrative accounts, there is a chance some of the test user accounts could have had known passwords. The security log held over 200 thousand entries, so they had been doing this for a while.


I then went back to ASC to have it implement an NSG on the VNets to allow RDP traffic only from our offices. The time was 5:10 EST, and within three minutes the attack stopped dead in its tracks. Nothing happened after that time, and I was relieved.
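Conceptually, the remediation amounts to a network security group rule like the following sketch in ARM-template style (the office address range shown is a placeholder); combined with the NSG’s default DenyAllInbound rule, it restricts RDP to the office network:

```json
{
  "name": "Allow-RDP-From-Office",
  "properties": {
    "priority": 100,
    "direction": "Inbound",
    "access": "Allow",
    "protocol": "Tcp",
    "sourceAddressPrefix": "203.0.113.0/24",
    "sourcePortRange": "*",
    "destinationAddressPrefix": "*",
    "destinationPortRange": "3389"
  }
}
```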


After a refresh of the screen, ASC reported the issue as resolved by the actions it took. I am impressed at how well it detected and remediated this; that is hard to do.


As IT professionals, we are asked to do so much with so little for so long that management thinks we can do anything with nothing forever. Project deadlines and business drivers are asking us to do more and more every day, but most companies don’t invest in threat detection or remediation software until there is an “event”. In Azure, it’s baked into the platform and can be implemented in a way that secures resources by default and audits those resources over time. For all of those IT professionals out there who are apprehensive about using Azure because of security concerns, I say to you that Azure gives you the tools you need to implement practical security measures. Given their deep pockets and laser focus on security, I believe Microsoft will make Azure more secure than any on-premises implementation, even if the consultants miss something.

Related articles:

Security and Compliance

Information Protection


FDLE CJIS Audit of Azure and Office 365 Completed


Last week, Microsoft announced that a Florida Department of Law Enforcement audit of Microsoft Cloud platforms regarding compliance with the FBI Criminal Justice Information Services (CJIS) Security Policy was completed.

The ramifications of the announcement are somewhat unclear, but reading between the lines, this is fantastic news! I think it means that soon, State of Florida agencies will be able to consider Azure and Office 365 when tracking CJIS-regulated law enforcement data.

Microsoft Inspire


Another Microsoft partner convention is wrapped up.  There weren’t a gazillion new product updates this year, which is fine with me.  The fire hose runs pretty steadily now.  I appreciate the consistency and refinement of services that is taking place.

This year, we were able to establish some new partnerships that we are hoping to formalize in the near future.

We also moved closer to formalizing our partnership with Aycron and hope to be offering customers their incredible mobile apps under our rapid-deployment, pay-as-you-go model.

I had a great time catching up with acquaintances and friends, learning some cool new stuff, and sharing our plans with as many people as I could!