[et_pb_section fb_built="1" admin_label="section" _builder_version="3.22"][et_pb_row admin_label="row" _builder_version="4.1" background_size="initial" background_position="top_left" background_repeat="repeat" hover_enabled="0"][et_pb_column type="4_4" _builder_version="3.25" custom_padding="|||" custom_padding__hover="|||"][et_pb_text _builder_version="4.1" hover_enabled="0"]

We are seeing a major change in the IT industry, and the cloud has become a safe haven. As major businesses switch to the public cloud, there has also been a strong push towards scaling and automation. Across both the technology and financial sectors, complex computing tasks that used to be handled manually are shifting to more streamlined, automated processes.

When going for cloud adoption, “businesses are pushing PaaS first and that has a lot of positive effects. To begin with, things are made easier right off the bat, because essentially all the building blocks are there, so any initial business can run workloads and get going within a day or two, rather than wasting too much time to set up the foundation,” stated Hentsu CEO, Marko Djukic.

An increasing number of businesses are implementing grid computing clusters to handle massive workloads.

What is Grid Computing?

Grid Computing refers to making use of the shared power of a cluster of computers to process computationally intensive tasks that would otherwise bog down a single workstation. To put it simply, jobs can be submitted from one computer to the grid, which then processes the data and returns the output to the user. Importantly, multiple people can simultaneously make use of the same grid by intelligently managing how the computers in the grid allocate resources. This allows for significantly improved workflow for companies whose work is optimized for grid computing.

Opening with a much-needed refresher on public and private cloud, as well as definitions of key terminology, the talk flowed into the challenges companies face when working with different data sets and a look at the solutions currently available to them.

How Grid Computing Works - Grid Architecture Explained

To recap, grid computing denotes one main computer distributing information and tasks across multiple networked computers, usually all working towards a single objective. Let us simplify a bit and illustrate how things operate. A grid computing network typically has three types of machines (a minimal sketch follows the list below):

  • Central Control Node - A server (or group of servers) that controls the entire network and keeps account of the resources in the network pool. Again, this part is used for controlling, not for processing.
  • Provider - A computer that contributes resources to the network.
  • User/Client - In basic terms, a machine whose user submits work to the grid and consumes its resources, regardless of geographical location.
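To make those three roles a little more concrete, here is a minimal, illustrative Python sketch (not a real grid framework, and all class and method names are hypothetical): the control node keeps track of provider machines and hands work out, while the client simply submits a job and collects the result.

# Illustrative only: a toy "grid" showing the three roles described above.
# In a real deployment the control node, providers and clients are separate
# machines talking over a network; here they are plain Python objects.
import itertools

class Provider:
    """A machine that contributes compute to the grid."""
    def __init__(self, name):
        self.name = name

    def run(self, job, payload):
        return job(payload)  # the provider does the actual processing

class ControlNode:
    """Tracks the resource pool and schedules work; it does no processing itself."""
    def __init__(self, providers):
        self._providers = itertools.cycle(providers)  # naive round-robin allocation

    def submit(self, job, payload):
        return next(self._providers).run(job, payload)

# Client/user side: submit a computationally intensive task and collect the output.
grid = ControlNode([Provider("node-1"), Provider("node-2")])
print(grid.submit(lambda xs: sum(x * x for x in xs), range(1_000_000)))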

Key Advantages of Grid Computing

One of the key aspects of grid computing is flexibility and, more importantly, computing power. In other words, it boils down to spreading large amounts of data across a grid of computers rather than placing the demand on a single supercomputer.

Here are some of the most frequently asked questions related to grid computing.

Why is grid computing important?
What are the real benefits and biggest advantages of grid computing?

Well, so many businesses rely on this particular method of completing joint tasks because it offers the following key advantages:

  • Improved use of existing hardware.
  • General performance increase and quicker handling of complex problems.
  • Easier to collaborate with other organizations.
  • Massive servers for applications can be divided into smaller commodity type servers.
  • Grid environments reduce the chance of failure - if one desktop or server fails, other resources pick up the workload.
  • Jobs can be executed in parallel, speeding up performance (see the sketch after this list).
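As a small illustration of the parallelism point above, the following sketch splits one large task into independent jobs and runs them concurrently using Python's standard concurrent.futures module; a real grid scheduler does the same thing across many machines rather than local worker processes.

# Minimal sketch: independent jobs executed in parallel, as a grid scheduler
# would do across machines, here approximated with local worker processes.
from concurrent.futures import ProcessPoolExecutor

def job(chunk):
    # stand-in for a computationally intensive task
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    chunks = [range(i, i + 250_000) for i in range(0, 1_000_000, 250_000)]
    with ProcessPoolExecutor() as pool:
        partials = list(pool.map(job, chunks))  # the four jobs run in parallel
    print(sum(partials))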

The Agility of Ephemeral Computing

It has to be highlighted that ephemeral computing has also been a huge part of the innovation process in modern tech. It carries tremendous advantages, but we have to ask the simple and most obvious question here: what does it mean for businesses, and for the SaaS industry in particular?

Describing a process as “ephemeral” means it is temporary and brief. Essentially, dealing with a surplus of servers, or indeed a shortage of servers, stops being an issue with ephemeral computing. To put it into perspective, ephemeral computing services are agile and adjust to the problems and needs at hand.

The Power of Code and Automation

Utilizing ephemeral clusters that scale up and down as needed is quite a boost to handling workloads in general. In short, it means you limit or eliminate convoluted pre-planning and the reliance on heavy server power.

“It’s basically all about serverless. Code that is distributed across ephemeral compute that handles the analysis and then churns out the answers without having to deal with what’s actually the underlying compute,” says Marko Djukic.

He added: “Compute is just a utility you consume as needed, nothing exists permanently.”

To summarize, in PaaS and SaaS scenarios, giving your business operations and workloads the ability to grow and shrink automatically can completely remove the scalability issues you may be experiencing with a traditional server-heavy computing methodology.
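To ground the serverless idea, here is a minimal AWS Lambda-style handler written in Python. The point is that the code only describes what to do with one unit of work; the platform provisions (and tears down) the compute that runs it. The event fields used below are assumptions for illustration, not part of any specific workload.

# Sketch of a serverless function: no servers to manage, and compute exists
# only while the request is being handled. Field names in "event" are illustrative.
import json

def handler(event, context):
    prices = event.get("prices", [])  # payload supplied by the caller
    average = sum(prices) / len(prices) if prices else None
    return {
        "statusCode": 200,
        "body": json.dumps({"average_price": average}),
    }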

Traditional tools and PaaS Offerings

There is a breadth of choices for grid computing and for migrating workloads into cloud environments. We covered some of those, from traditional MATLAB setups to more extreme Platform as a Service (PaaS) environments from Google using BigQuery and Datalab, and ran some live demos ripping through 2TB of full-depth market data.
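For a flavour of the PaaS side, the snippet below runs a query with the google-cloud-bigquery Python client, letting BigQuery supply the compute. The project, dataset, table and column names are placeholders, not the data set used in the demos.

# Sketch: querying a large market-data table with BigQuery's Python client.
# Project, dataset, table and column names below are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")
query = """
    SELECT symbol, COUNT(*) AS ticks
    FROM `my-project.market_data.full_depth`
    WHERE trade_date = '2020-06-01'
    GROUP BY symbol
    ORDER BY ticks DESC
    LIMIT 10
"""
for row in client.query(query).result():  # BigQuery scales the compute for you
    print(row.symbol, row.ticks)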

Key points to take away:

  • Horses for courses - What machine spec does your task run best on? Many cores in fewer machines, or fewer cores spread across many smaller machines? Consider how best to configure your cluster.
  • Reduced upfront costs - Unlike traditional grid computing clusters, which require purchasing all the machines before they can be used, cloud solutions let you skip paying for hardware upfront.
  • Flexibility - Publicly available solutions allow you to quickly scale the size and power of your cluster to ensure you can crunch the data in the time you need.
  • Reducing worker downtime - Having employees sitting around waiting for their code to execute wastes time and money. By pushing tasks off individual computers and onto the cloud, the bottleneck is alleviated and workers can continue with their work.
  • Full Platform as a Service (PaaS) - Allows very dynamic and fast access to compute across vast data sets, but will usually require significant re-tooling for major hedge fund production environments. Implementing PaaS-type grid computing also involves a more holistic approach across people, processes and technology.
[/et_pb_text][/et_pb_column][/et_pb_row][et_pb_row _builder_version="4.1" column_structure="1_2,1_2"][et_pb_column _builder_version="4.1" type="1_2"][et_pb_text _builder_version="4.1" hover_enabled="0"]

How Can Hentsu Help?

We've built up a wealth of in-house expertise running grid computing workloads across all three major public clouds - Amazon AWS, Microsoft Azure and Google Compute Engine. We can get you up and running quickly with pre-tested designs and architectures, greatly reducing the traditional pains and TCO of running grid computing.

[/et_pb_text][/et_pb_column][et_pb_column _builder_version="4.1" type="1_2"][et_pb_image _builder_version="4.1" src="https://3bb4f13skpx244ooia2hci0q-wpengine.netdna-ssl.com/wp-content/uploads/2020/06/GeekRadar-Medium.gif" hover_enabled="0"][/et_pb_image][/et_pb_column][/et_pb_row][et_pb_row _builder_version="4.1"][et_pb_column _builder_version="4.1" type="4_4"][et_pb_text _builder_version="4.1" hover_enabled="0"]

Get in touch: hello@hentsu.com, we'd love to hear about your grid computing challenges.

[/et_pb_text][/et_pb_column][/et_pb_row][/et_pb_section]

Incoming news, people. As we march on through 2020, an increasing number of tech companies are focusing on automation and code-driven operations. Numerous major shifts are happening in IT, but more importantly, most of them rest on the shoulders of cloud computing. Join us as we dig into the latest scoops from the technology and cloud scene.

Cloud Spending Jumps in 2020, As COVID-19 Increases Demand for Collaboration

Another confirmation has just hit the news scene that demand for cloud services is on the rise. COVID-19 is still very much an ongoing health crisis all over the globe, and this has triggered the need for instant cloud-based services and tools. Data gathered from recent research shows the latest numbers on cloud service usage. Word is: “The big takeaways from the report show an overall (but modest) increase on cloud spending (2.2%) and a major drop (16.3%) on spending for non-cloud infrastructure.” That’s right, the report does point to COVID-19 as the key variable influencing cloud spending. Additional information in the report states that remote work has "increased demand for cloud-based consumer and business services driving additional demand for server, storage, and networking infrastructure utilized by cloud service provider datacenters." News via TechRepublic.

Microsoft Boosts UK Cloud Computing with Azure Launches

According to the latest news from MS, fresh cloud services are on the way, including three cybersecurity tools. Also, Microsoft released 10 brand new cloud computing services specifically for their UK Azure regions. The tools are being touted as innovative and utilizing the latest tech to keep data secure. “Microsoft continues to invest in our UK Azure regions to meet the growing needs of our customers. Azure is helping organisations, both large and small, adapt to a new way of working, and our cloud experts continue to help them at this challenging time,” said Michael Wignall, Azure Business Lead at Microsoft UK. News via Official Microsoft Blog.

Amazon Launches Cloud Service to Help Non-coders Build Apps

It must be said that Amazon has dominated the cloud infrastructure market, especially in the last few years. The cloud and e-commerce veteran is now upgrading its cloud-related product lineup. The specific tool is called Honeycode and is officially described as a tool for people who do not code. Shocking, right? Naturally, this is a strategic effort by AWS: the Honeycode service will compete with offerings from cloud challengers Google and Microsoft. Here’s a snip: “The service is free for up to 20 users and as many 2500 rows of data in a spreadsheet that’s part of the product. AWS will charge based on storage and number of users. Longtime AWS customers Slack and SmugMug are among those planning to use the service, the company said. The service is available today, currently in one AWS region. AWS plans to make it possible to export data from Honeycode, Vaidyanathan.” News via CNBC and AWS.

Join us as we browse through a selection of fresh and exciting news stories. Hentsu's weekly tech news roundup takes a peek at the latest and hottest topics within the current technology scene, the economy, gaming and, of course, the cloud.

Global Economy is Tanking, the Cloud Thrives

COVID-19 may have brought the world to its knees, but people are coping and surviving. Once again, the greatest cloud providers are rising to the occasion as the entire globe hops on the online bandwagon. In short, a majority of businesses are looking more prosperous as they migrate to the public cloud environment. And it’s not the first time we’ve heard this. “Demand from travel and hospitality companies is down, but cloud usage to power gaming, videoconferencing, and remote learning has spiked up. Telehealth provider Amwell, an Amazon cloud customer, has seen video health visits increase more than tenfold due to Covid-19, serving as many as 45,000 per day.” Also, “That new demand has sharpened the appetite for chips, disks, and networking gear—the physical components that power the cloud. Microsoft’s chief financial officer, Amy Hood, told investors last month that the company expects to spend more on cloud computing infrastructure due to increased demand spurred by Covid-19.” News via Wired.

Businesses Set to Spend Big on Cloud this Year, Security Projections Shift

As more and more companies turn to cloud adoption, security spending is also set to grow 2.4% to reach $123.8 billion in 2020, slightly down from last year’s projections. With cloud-based delivery models now handling a large share of security in the industry, growth is slower but still worth investing in. Have a peek: “Cloud infrastructure was found to be the biggest driver of this rise, with over three-quarters (76 percent) increasing their use of platforms such as Amazon Web Services (AWS), Microsoft Azure….” “Cloud spending has soared in 2020 so far due to the increased need for remote working capabilities brought on by the coronavirus pandemic and subsequent lockdowns, but this trend is set to continue even as workers begin to return to the office, according to a survey by Snow Software.” News via TechRadar UK.

Apple Is Building the Perfect Laptop for Remote Work

The world is changing, the economy is reshaping, and the stay-at-home workplace is now the "new normal." As a result, the biggest manufacturers and IT companies are adapting to this situation fast. Apple is no exception. Right now it is devoted to creating and launching the new “perfect laptop.” Yep, the tech industry giant is expected to officially make the switch to in-house ARM processor-powered Macs. The announcement is most likely going to occur during WWDC 2020, which is just around the corner. The ARM-based chips “are the same processors that power the iPhone and iPad. Before you wonder why Apple would stick a smartphone processor in a laptop, it's worth mentioning that it wouldn't be the first (Samsung and Microsoft already do) and that the A12Z and A13 chips in Apple's devices are as powerful as the processors in many laptops.” News via Inc.

PlayStation 5 Announced, Featuring Cloud Functionality

Last week Sony finally uncovered the full details and the appearance of its upcoming console, the PlayStation 5. One of the more interesting moments was Sony promising to put two versions on the market: one with a 4K UHD Blu-ray drive and another that is digital only (in other words, you order your games, movies and apps online, directly to the console, as the current trends dictate). Another cool piece of info is that the PS5 was confirmed to offer cloud functionality: “We are cloud-gaming pioneers,” PlayStation hardware architect Mark Cerny explained to Wired when asked about cloud functionality, “our vision should become clear as we head towards launch.” Meanwhile, “game sizes should be smaller or, at least, better optimized. Due to the SSD-only solution with the PS5, developers will no longer need to duplicate data to make a standard 5400 RPM read faster.” Learn more about the new PS5 console at EG.

[et_pb_section fb_built="1" _builder_version="4.1"][et_pb_row column_structure="1_2,1_2" _builder_version="4.1"][et_pb_column type="1_2" _builder_version="4.1"][et_pb_text _builder_version="4.1"]

Putting it all Together

We continue our journey through Microsoft Flow. After covering Flows calling other Flows and recursion concepts in the previous posts, we are going to go through the overall idea of what we are building and test out its performance.
[/et_pb_text][/et_pb_column][et_pb_column type="1_2" _builder_version="4.1"][et_pb_image src="https://3bb4f13skpx244ooia2hci0q-wpengine.netdna-ssl.com/wp-content/uploads/2020/06/GeekRadar-Medium.gif" _builder_version="4.1"][/et_pb_image][/et_pb_column][/et_pb_row][et_pb_row _builder_version="4.1"][et_pb_column type="4_4" _builder_version="4.1"][et_pb_text _builder_version="4.1"]

We will use Flow to generate some JSON for the organizational hierarchy, which can then be used in various org charts around the business. There is a specific D3.js library which will be used to display it all, but for now we are going to cover the overall structure needed to generate it all properly.
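For reference, the nested structure we are aiming for looks roughly like the example below, shown here as a Python dictionary serialized to JSON; the exact field names depend on the D3.js chart library you choose, so treat them as placeholders.

# Illustrative shape of the org-hierarchy JSON the Flow assembles.
# Field names are placeholders; match them to whatever the D3.js chart expects.
import json

org = {
    "name": "Jane Smith",
    "title": "Head of Operations",
    "children": [
        {"name": "Direct Report A", "title": "Team Lead", "children": []},
        {"name": "Direct Report B", "title": "Analyst", "children": []},
    ],
}
print(json.dumps(org, indent=2))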

For transparency, using Flow is probably not the best way to do this, given the number of workarounds needed to achieve what should be basic programming techniques. This org chart approach is pushing the limits of what we can do with Flow, just to see how far we can get with its functionality. Its huge advantage is that it is available to anyone using Microsoft 365, so even though it may not be an elegant solution at times, it is hugely powerful because it puts this kind of automation in the hands of every user.

Microsoft Flow Performance

Each HTTP call has a performance hit, which adds up in a recursive traversal, so this is not the most efficient way to traverse a very large organizational hierarchy in Microsoft 365. A department of about 30 individuals takes roughly 50 seconds to traverse, and a larger group of 60 individuals can approach the 120-second maximum. These timings fluctuate, most likely due to the load and capacity of the underlying compute that Microsoft puts at Flow's disposal.

Caching

There are two Flows: “Get Manager Org JSON” is currently the main entry point, which pulls the manager details and then calls the “Get Direct Reports JSON” Flow to get the reports; that second Flow has the logic to call itself multiple times to get all the levels of reports below.

[/et_pb_text][et_pb_image src="https://3bb4f13skpx244ooia2hci0q-wpengine.netdna-ssl.com/wp-content/uploads/2020/06/Two-flows.png" show_in_lightbox="on" _builder_version="4.1"][/et_pb_image][et_pb_text _builder_version="4.1"]

We could look at some form of caching. If the Flow is traversing an organizational hierarchy and along the way encounters a manager who has already been mapped out, we could just reuse that result. So, if we start with lower-level managers, save their results to, say, a SharePoint library, and then assemble that existing JSON into any calls above those managers, we can avoid doing the traversal from scratch.
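The idea is easier to see outside of Flow, so here is a rough Python sketch of the same caching logic, with a dictionary standing in for the SharePoint library and a stub get_direct_reports() standing in for the Microsoft 365 calls the Flows make.

# Sketch of the caching idea: reuse a manager's subtree if it was already built.
# "cache" stands in for the SharePoint library; get_direct_reports() stands in
# for the HTTP calls against Microsoft 365.
cache = {}

def get_direct_reports(manager):
    return []  # placeholder for the real directory lookup

def build_org_json(manager):
    if manager in cache:  # saved JSON exists, so skip the traversal
        return cache[manager]
    node = {
        "name": manager,
        "children": [build_org_json(report) for report in get_direct_reports(manager)],
    }
    cache[manager] = node  # save the result for callers higher up
    return node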

Saving to SharePoint

We start by adding a step (1) to our “Get Manager Org JSON” Flow, which saves the JSON output to a SharePoint site. We can use the manager's name as the filename so we can reference that same JSON later.

[/et_pb_text][et_pb_image src="https://3bb4f13skpx244ooia2hci0q-wpengine.netdna-ssl.com/wp-content/uploads/2020/06/Save-to-SharePoint.png" show_in_lightbox="on" _builder_version="4.1"][/et_pb_image][et_pb_text _builder_version="4.1"]

Then, in the “Get Direct Reports JSON” Flow, we add the corresponding conditional within the loop: if the direct report is a manager and that manager already has JSON saved to the SharePoint site (2), simply read that JSON rather than traversing. Otherwise, if there is no saved JSON, traverse as before.

[/et_pb_text][et_pb_image src="https://3bb4f13skpx244ooia2hci0q-wpengine.netdna-ssl.com/wp-content/uploads/2020/06/Read-from-SharePoint.png" show_in_lightbox="on" _builder_version="4.1"][/et_pb_image][et_pb_text _builder_version="4.1"]

The Result

The larger group, which took ~120 seconds to traverse, now completes in under 10 seconds when assembling 5 pre-populated org charts for the direct reports. This saves a tremendous amount of time with a simple tweak to save results to a SharePoint site. More importantly, it shows how several Microsoft 365 features can be joined up to solve an interesting programming challenge.

[/et_pb_text][/et_pb_column][/et_pb_row][et_pb_row _builder_version="4.1"][et_pb_column type="4_4" _builder_version="4.1"][et_pb_text _builder_version="4.1"]
Expand your knowledge of the Microsoft Modern Workplace by subscribing to our YouTube channel:
[/et_pb_text][/et_pb_column][/et_pb_row][/et_pb_section]

[et_pb_section fb_built="1" _builder_version="4.1"][et_pb_row _builder_version="4.1"][et_pb_column type="4_4" _builder_version="4.1"][et_pb_video src="https://www.youtube.com/watch?v=uM5Mw-Fxql4&feature=youtu.be" _builder_version="4.1"][/et_pb_video][/et_pb_column][/et_pb_row][et_pb_row column_structure="1_2,1_2" _builder_version="4.1"][et_pb_column type="1_2" _builder_version="4.1"][et_pb_text _builder_version="4.1"]

There are many paths to take on the road to strong security. Our goal is to help you minimize, and ultimately eliminate, the risk of losing personally identifiable information and, of course, to make sure it is all compliant with ever-changing regulations. These are vital ingredients for enterprise-level business operations, especially when adapting to the COVID-19 crisis. With companies becoming increasingly focused on the stay-at-home environment, protecting data and information has never been more important.

[/et_pb_text][/et_pb_column][et_pb_column type="1_2" _builder_version="4.1"][et_pb_image src="https://3bb4f13skpx244ooia2hci0q-wpengine.netdna-ssl.com/wp-content/uploads/2020/06/GeekRadar-Medium.gif" _builder_version="4.1"][/et_pb_image][/et_pb_column][/et_pb_row][et_pb_row _builder_version="4.1"][et_pb_column type="4_4" _builder_version="4.1"][et_pb_text _builder_version="4.1"]
Also, folks, in order to keep track of all the upcoming webinar episodes, feel free to subscribe to our YouTube channel below:

[/et_pb_text][/et_pb_column][/et_pb_row][/et_pb_section]
