Spectra Logic: Product Overview and Insight (April 14, 2021)

Company: Spectra Logic (data storage and data management solutions)

Company Description: Spectra Logic develops data storage and data management solutions that solve the problem of digital preservation for organizations dealing with exponential data growth. Spectra enables affordable, multi-decade data storage and access by creating new methods of managing information in all forms of storage—including archive, backup, cold storage, private cloud and public cloud.

Markets: Spectra Logic is a 40-year-old global organization that sells directly to midsize and large organizations, as well as through a worldwide network of resellers and distributors who offer Spectra solutions to customers in a wide range of industries, including media and entertainment, education, government, finance, energy, health care, scientific research and high-performance computing. Spectra Logic has also established strong strategic partnerships with Fortune 50 organizations and key technology partners to ensure interoperability and compatibility.

Product and Services

Spectra Logic’s agile and inventive approach has led to more than 125 patents and an expanding portfolio of solutions deployed in more than 80 countries. Spectra offers a wide solution set that includes disk, object storage, tape and hybrid cloud storage, in addition to storage lifecycle management software. StorCycle, the company’s flagship storage lifecycle management software, automatically identifies and moves inactive data from primary storage to a lower-cost tier that can include cloud, object storage disk, network-attached storage and tape.

Key Features

StorCycle is storage lifecycle management software that ensures data is stored on the right tier throughout its lifecycle for greater IT and budgetary efficiency. According to the company, more than 80 percent of data is stored on the wrong tier, costing organizations millions of dollars a year. StorCycle can reduce the overall cost of storing data by up to 70 percent by enabling organizations to efficiently scan primary storage and migrate inactive data and finished projects to a lower-cost tier of storage for long-term preservation and access.

StorCycle delivers four key elements of data storage lifecycle management:

  • Identification: A scan of an active source file system compiles and presents real-time analytics, revealing an actionable view of the data landscape. Scans can be scheduled, throttled as necessary and reused as needed;
  • Migration: Migrations can be automated on the basis of past or upcoming scheduled scans, or run project by project on entire data sets or directories. After migrating data, StorCycle accurately maintains directory structures and Access Control Lists;
  • Protection: StorCycle makes and tracks multiple copies on a variety of targets, adding both geographic and genetic diversity to data protection plans;
  • Access: HTML links or symbolic links, along with a web-based search, keep data easily accessible to users in a semi-transparent or transparent manner. The software keeps archived data active, allowing users to apply new technologies to it.

Interoperable with Linux, Mac and Windows, StorCycle identifies inactive files on primary storage based on policies set by the administrator and migrates those files to a lower-cost tier of storage, which can be any combination of cloud storage, object storage disk, network-attached storage and tape. Users also can move entire completed data sets, such as machine-generated data, scientific output and finished videos, with the Project Archive method. This reduces the amount of data residing on expensive primary storage, shrinking backup windows, increasing performance and reducing the need for additional primary storage purchases.
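
To make the mechanics concrete, here is a minimal Python sketch of this kind of policy-driven tiering: scan a source tree, find files untouched for a set number of days, move them to a cheaper tier and leave a link behind. The paths and threshold are invented for illustration; this shows the general pattern, not StorCycle's actual implementation.

    import shutil
    import time
    from pathlib import Path

    # Hypothetical policy values; a real product exposes these as admin settings.
    SOURCE = Path("/mnt/primary")        # expensive primary storage
    ARCHIVE = Path("/mnt/archive-tier")  # lower-cost target (NAS, object store mount, etc.)
    INACTIVE_DAYS = 180                  # administrator-defined "inactive" threshold

    def migrate_inactive_files() -> None:
        cutoff = time.time() - INACTIVE_DAYS * 86400
        for path in SOURCE.rglob("*"):
            if path.is_file() and path.stat().st_atime < cutoff:
                dest = ARCHIVE / path.relative_to(SOURCE)
                dest.parent.mkdir(parents=True, exist_ok=True)  # preserve directory structure
                shutil.move(str(path), str(dest))
                path.symlink_to(dest)  # leave a link so users can still reach the file

    if __name__ == "__main__":
        migrate_inactive_files()

Leaving a symbolic link at the original location mirrors the transparent-access approach described in the Access bullet above.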

Additionally, StorCycle protects data through end-to-end encryption on all storage targets and through storage of multiple copies of data on multiple storage mediums. It is fully ADFS-compliant, meaning file permissions remain intact regardless of where data is stored. StorCycle enables organizational data to be stored in two geographically separate locations, for example in the cloud and on local NAS.

The scheduled delete feature enables users to configure automatic deletions of migrated data after it has been retained on a storage target for a preset period of time. Other features enable users to prioritize restore jobs, activate one-click job reruns, archive and restore user-generated symbolic links, obtain CIFS/SMB support with Linux, and attain improved file search via background database indexing.

The latest version of StorCycle exposes a RESTful API that covers the software’s core features, including scanning, migrating and restoring data, so advanced users can build integrations and applications on top of its storage lifecycle management capabilities or weave StorCycle into wider workflows. In addition to commands for configuring storage locations, the API helps users build applications that manage jobs or perform bulk actions without using the web interface.
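
Spectra's own documentation, not this overview, defines the actual endpoints, so the snippet below is only a hedged sketch of what job automation against a REST API of this kind tends to look like. The base URL, paths, payload fields and response shape are hypothetical placeholders, not StorCycle's documented API.

    import requests

    BASE_URL = "https://storcycle.example.com/api"  # hypothetical host and path
    TOKEN = "REPLACE_WITH_API_TOKEN"                # hypothetical auth scheme

    session = requests.Session()
    session.headers.update({"Authorization": f"Bearer {TOKEN}"})

    # Hypothetical call: start a migrate job without touching the web interface.
    job = session.post(
        f"{BASE_URL}/jobs",
        json={"type": "migrate", "source": "/projects/render-farm", "target": "archive-tier"},
        timeout=30,
    )
    job.raise_for_status()
    job_id = job.json()["id"]  # hypothetical response shape

    # Hypothetical call: poll the job, the sort of bulk scripting an exposed API enables.
    status = session.get(f"{BASE_URL}/jobs/{job_id}", timeout=30).json()["status"]
    print(job_id, status)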

StorCycle also now extends cloud support to Microsoft Azure, including both the standard (Hot/Cool) and Archive tiers. Azure can be used as a storage target for migration jobs, helping organizations leverage the cost-effectiveness and ease of cloud storage. This is in addition to StorCycle’s existing support for the Amazon S3 Standard and Glacier tiers.
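
For teams scripting their own uploads outside StorCycle, those Azure tiers correspond to the standard blob access tiers in Microsoft's azure-storage-blob Python SDK. A minimal sketch, assuming placeholder connection details and container names:

    from azure.storage.blob import BlobServiceClient

    # Placeholder connection string and names; substitute real values.
    service = BlobServiceClient.from_connection_string("REPLACE_WITH_CONNECTION_STRING")
    blob = service.get_blob_client(container="long-term-archive",
                                   blob="projects/film-scan-001.tar")

    with open("film-scan-001.tar", "rb") as data:
        # "Hot", "Cool" and "Archive" are the access tiers discussed above.
        blob.upload_blob(data, overwrite=True, standard_blob_tier="Archive")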

Insight and Analysis

There are no user reviews of Spectra Logic on the major software review sites, including TechnologyAdvice, G2 Crowd, Gartner Peer Insights, IT Central Station, Capterra and Serchen.

Delivery: Direct from Spectra and through the company’s global network of value-added resellers and distributors

Pricing: Annual subscriptions. For pricing or information, call 1-720-301-0153 or email sales@spectralogic.com

Contact information: sales@spectralogic.com

eWEEK is building an IT products and services section that encompasses most of the categories that we cover on our site. In it, we will spotlight the leaders in each sector, which include enterprise software, hardware, security, on-premises-based systems and cloud services. We also will add promising new companies as they come into the market. Here is a list of examples: https://tinyurl.com/EW-product-overview

NetBrain Launches Multi-Cloud Support, Low-code/No-code Toolset (April 9, 2021)

Network operations software maker NetBrain has released a new version of its main platform, NetBrain v10.0, which the company contends breaks new ground in end-to-end network automation.

This is all about ensuring network and application uptime; downtime can be a killer for enterprise IT systems.

Using v10.0, NetBrain said, users get advanced multi-cloud network support, no-code/low-code automation, enhanced incident collaboration and intent-based automation at scale, for any type of network. Each of these capabilities, the company says, helps keep networking frictionless.

Automation for network operations is an important requirement for many enterprises, but there are also some barriers to adoption. According to Cisco Systems’ 2020 Networking Trends Report, IT leaders believe that network automation will have the most impact on networking during the next five years. Yet even though 35 percent of network strategists see troubleshooting networking issues as the most resource-intensive and time-consuming activity for network operations today, more than 70 percent of network management is still done manually.

According to the Cisco report, 27 percent of IT leaders identified the lack of necessary skills as a key obstacle to transitioning to an advanced network.

NetBrain v10.0, released April 7, advances the company’s mission to democratize automation to support any network, for any problem, and for every person. By introducing new tools and enhancing existing automation capabilities, v10.0 makes it easier for an entire team to develop advanced automation for troubleshooting, documentation, security and compliance.

New collaboration features accelerate knowledge sharing and help close the knowledge gap within both the network teams and IT teams in general.

NetBrain v10.0 includes the following features, according to the company:

  • Multi-Cloud Support. NetBrain now offers native, robust support for Amazon Web Services (AWS) and Microsoft Azure public cloud environments, allowing the two most common enterprise cloud networking platforms to be discovered, visualized and operated in NetBrain on the same dynamic map as the rest of a customer’s enterprise network infrastructure.
  • Intent-based automation (IBA). Network Intent is the first tool of its kind to document network intent for specific devices, predefining a baseline for normal behavior and automating verification for even the most complex network setups. Adaptive Monitoring proactively takes immediate action on network anomalies by executing those predefined network intents, speeding up troubleshooting to find root causes instead of symptoms. (A generic sketch of intent verification follows this list.)
  • Incident-Based Collaboration. Network troubleshooting and remediation is a team sport. NetBrain v10.0 introduces a set of incident response tools, including the Incident Management Dashboard for real-time messaging and central problem visualization; the NetBrain Incident Portal for secure, cross-team collaboration (including non-NetBrain users); and SmartCLI for easy sharing of CLI diagnostics across the organization.
  • Low-code/No-code Automation. NetBrain is introducing multiple tools to democratize the development of automations for network operations. These include the Visual Parser guided wizard for isolating critical data variables to build diagnostic automations faster.
  • Feature Intent Template. This automates the act of automating. The Feature Intent Template drives rapid and scalable automation development across the IT organization.
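
NetBrain has not published Network Intent's internals, but the general pattern of intent-based verification is easy to sketch: declare the expected state of a device, collect its live state, and flag drift. The Python below is a generic illustration with invented device data, not NetBrain's implementation.

    # Generic intent-based verification, not NetBrain's code.
    INTENTS = {
        "core-router-1": {"bgp_neighbors_up": 4, "ospf_area": "0.0.0.0"},
    }

    def collect_live_state(device: str) -> dict:
        # Stand-in for a real collector (SNMP, CLI scraping, streaming telemetry).
        return {"bgp_neighbors_up": 3, "ospf_area": "0.0.0.0"}

    def verify(device: str) -> list:
        live = collect_live_state(device)
        return [
            f"{device}: {key} expected {want}, found {live.get(key)}"
            for key, want in INTENTS[device].items()
            if live.get(key) != want
        ]

    for device in INTENTS:
        for anomaly in verify(device):
            print(anomaly)  # adaptive monitoring would trigger deeper diagnostics here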

NetBrain v10.0 is now generally available. Current customers can request upgrades by emailing upgrades@netbraintech.com. For more information, go here.

World Backup Day 2021: It’s All About Automation (March 31, 2021)

Each year, March 31 is World Backup Day, which reminds all of us that we need to back up all our personal and business-related documents, photos, videos—anything else digital that we value.

Those of us who have lost important files in the past know what a pain it is—and how much valuable time it takes—to recreate them or make new copies, if it’s possible. Thinking ahead and taking the time and attention needed to ensure the protection of data valuables is well worth it and doesn’t cost that much.

There’s really no excuse anymore not to do this: Automated backup to the cloud or to onsite storage is simple to set up and inexpensive. And so many vendors are willing to bend over backward for your business that you shouldn’t have any problem finding one that fits your needs exactly.

Successful backup providers include those quoted in today’s article plus Code42, Intronis, iDrive, Nero, OpenDrive, Seagate i365, Carbonite, EMC Mozy, Jungle Disk, Veritas, Zetta, Druva, Asigra, Box, Veeam, Dropbox, Egnyte, Ftopia, Barracuda, SpiderOak and others.

Here are some good pieces of advice from people who know all about this topic.

Doug Matthews, Vice President of Enterprise Data Protection/Analytics, Veritas:

As some employees begin returning to a physical office and others continue working remotely, organizations must pay close attention to protecting themselves from vulnerabilities that come with a hybrid work model. The biggest problem with this model happens as devices “out in the wild” are then physically brought back into the business network. Companies must be aware these devices may be carrying a range of sleeper viruses and ransomware that could put data at risk.

This World Backup Day, organizations must take the time to understand where data lives – and standardize data protection practices across every environment, workload, cloud, and deployment model. Now’s the time for enterprises to ensure all data is properly secured, their employees are trained on security policies, and the right tools are in place to quickly recover if disaster strikes.

Matt Waxman, Vice President, Product Management, Cohesity:

As we reflect on the past year, the world has been turned upside down, so talking about RPOs and RTOs just doesn’t seem appropriate. We are living in an age where ransomware has almost become a household term. Sophisticated ransomware attacks are increasingly targeting backup data in addition to what resides in production, to knee-cap organizations and their last lines of defense. Tackling ransomware is multi-faceted, but without a doubt having a comprehensive data protection strategy with a foundation built around immutability is no longer a nice to have, but a must-have.

All companies, big and small, need to be on their guard and put defenses in place to reduce the chances of becoming the next victim. When combined with the cloud, an immutable file system is an incredibly powerful way of overcoming ransomware attacks. For too long backup has been a chore, or worse, an afterthought. However, in 2021, it is clear that sticking with your existing backup vendor’s protection without thoroughly assessing its immutability credentials is akin to doing nothing, which can no longer be an option.

Rotem Iram, Founder and CEO, At-Bay:

Backups have become important to companies that are exposed to digital risk, and to the insurance companies that protect them. An effective backup approach significantly lowers recovery costs, shortens downtime, and provides an alternative to engaging with criminals in the case of ransomware attacks. All of these factors lower risk and can contribute to lower insurance premiums.

Unfortunately, many companies are failing to put an adequate plan in place. While the vast majority of companies claim to have backups and restoration processes, our data show otherwise:

  • Smaller companies (under $10M revenue) are twice as likely to lack proper backups as larger organizations ($10-100M), and four times as likely to lack proper backups as enterprises ($100M and above).
  • Public administration orgs are 7 times more likely not to have backups compared to private organizations, and small public orgs (under $10 million in budget) are as much as 15 times more likely not to have backups.
  • In the private sector, “offline” / traditional businesses (such as warehouse, logistics, transportation, utilities, agriculture) trail in implementing backups by 2-3x compared to professional services and technology businesses.

These shortfalls are going to cost companies a lot of money. Regardless of size, every company needs to have a strategy in place that ensures backups are segregated from the network, saves copies for 30-90 days (once there is a breach, your most recent backups may include malware), and backs up all important systems and data, even those in secondary locations or legacy systems.
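
Kept concrete, that guidance means automating the retention window rather than trusting the newest backup alone. The sketch below prunes dated archives older than 90 days, the upper end of the window above; the directory layout and naming convention are invented for the example, and the backup target itself should live off the production network.

    import time
    from pathlib import Path

    BACKUP_DIR = Path("/backups/daily")  # hypothetical, network-segregated location
    KEEP_DAYS = 90                       # upper end of the 30-90 day window

    cutoff = time.time() - KEEP_DAYS * 86400
    for archive in BACKUP_DIR.glob("backup-*.tar.gz"):
        if archive.stat().st_mtime < cutoff:
            # Older copies age out; a long tail of recent ones stays, in case the
            # newest backups already carry malware.
            archive.unlink()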

Adrian Moir, Technology Evangelist and Principal Engineer at Quest Software:

The shift to remote work is empowering organizations to take backup and recovery more seriously–especially as many face new challenges navigating the move to the cloud and data proliferation. While best practices for backup and recovery haven’t changed, they have been impacted by the evolving way data is collected and changing nature of that data, requiring targeted backup technologies to adapt to ensure compliance with existing practices.

Organizations need to focus on the data itself when it comes to what the future of work will be. Whether a company’s workforce is at home or in an office space, employees are now working from anywhere, meaning organizations can no longer ensure data is always secured behind enterprise-grade equipment in a corporate environment. However, this doesn’t mean companies should respond by locking environments down; that ultimately leads employees to find security workarounds, causing dark data and shadow IT growth.

Just as we’re shifting mindsets on where employees work, companies need to rethink where their data is coming from and where it ends up. This year, World Backup Day serves as a time to reflect on how teams are being enabled to work with data. In today’s remote world, it’s important that easy and efficient solutions are in place to ensure data can be tracked and properly protected, to keep the business running. On top of this, companies must continue to plan and test against that plan. Ensure that backup and recovery strategies put in place are actually effective and if they are not, plan again.

Dave Wagner, CEO of Zix:

Top two common misconceptions:

  • Once you migrate to the cloud, your data is safe.

According to recent Check Point research, ransomware attacks are growing at a rate of 9% per month in 2021. Of course, the more data you have in the cloud, the higher the cost of a ransomware attack on your business. With the increase of cloud storage driven by remote work, these risks now apply to all of us! That’s why it’s critical to have alternative solutions and/or a third-party cloud provider for automated backups.

  • Your SaaS provider is a one-stop-shop for backup and recovery. 

Today’s SaaS applications are incredibly powerful when it comes to boosting collaboration and productivity, but they fall short when it comes to the data retention, backup and restore features needed to recover data at any point in time. Apps like Google G Suite and Box, for instance, offer a limited grace period for restoring messages and files, after which data is gone forever. Other SaaS providers, including Microsoft, directly recommend using third-party backup services.

Three ways to proceed:

  • Educate at the executive level before empowering the rest of your team.

It is a critical first step to empower your team to take control of the data in their departments by giving them an all-in-one tool to do it. Teach them how to manage daily backups through reports, how to control user access, and how to recover lost or corrupted data, if and when disaster strikes.

  • Know the difference between backup and archive and why both are important.

While often used interchangeably, backup and archive are two very distinct functions. Backups provide you with a safety net in the case of an outage or breach by storing a copy of your data in real time in a secondary location (a third-party cloud provider), making it possible to recover.

Archive, on the other hand, is the process of moving certain data that’s no longer in use but needs to be stored and cataloged for long periods of time. It is used mostly by companies that need to comply with data-retention regulations.

  • Get the right automated backup tools in place now. 

The risk of ransomware is real, and your data is a living system that constantly changes during day-to-day operations. That’s why daily automated backups for all of your software platforms, from Google Drive to Salesforce and Box, are necessary. You want a simple process for managing your data: Set It (archive location, backup time, frequency, etc.), Manage It (reports, backup exports, etc.) and Access It (search and locate data).

Paul Speciale, Chief Product Officer, Scality:

As data becomes increasingly valuable, backup and data protection at large will remain key topics. Storage and backup infrastructure must continually be revisited as business needs evolve.

Over the last several years, we’ve seen the growth of hybrid cloud backup and business continuance strategies. Traditionally, enterprise IT has solved business continuance requirements through redundant infrastructures, in two physical enterprise data centers that are geographically separated. This commonly depended on the use of data replication strategies to maintain redundant data copies across these sites. The real shift in the last 24 to 36 months is that enterprises are now truly embracing hybrid cloud as the mechanism to enable this redundancy, by using a public cloud provider such as AWS or Azure as the secondary site, with hybrid cloud data replication as the means to maintain data synchronization between the data center and the public cloud.

On-premises backups will certainly remain, since data is now being created everywhere, from the corporate data center, to the cloud and now the emerging edge tier, but a major new trend in the world of IT infrastructure and application development will be a shift to the “cloud-native” model. This will create a demand for an entirely new set of data protection applications that are container and Kubernetes aware, so as to provide cloud-native backup solutions for these environments.

Joe Gaska, CEO of GRAX:

Most enterprises consider data backup as a type of insurance policy in case something goes wrong. However, for World Backup Day, we’re asking businesses to think beyond using backup data for insurance, and see it for the strategic asset it can be.

To turn this page, organizations must take ownership of their data – especially SaaS application data, which is where so much of business takes place these days. If you own your cloud app data by backing it up and storing it in your own cloud infrastructure, and you control the frequency with which it is captured, you can get the granular historical data needed to minimize the impact of app data loss and to comply with industry regulations.

Just as important, your business users can get unfettered access to the data generated in those SaaS applications, and feed it into their analytics, ML, AI and other systems. Accessing backup data from your own cloud storage, instead of using APIs to hit the applications themselves, is a much easier and much more cost-effective way to leverage SaaS data for competitive advantage. And it transforms backup into a true business accelerator.

Ajay Sabhlok, CIO & CDO at Rubrik:

Ransomware attacks have become increasingly frequent over the past year, and it’s no longer a question of if an organization is going to become a victim of ransomware, but when. When businesses suffer an attack, CIOs face one of two situations: those whose data has been compromised are forced to pay a ransom, while those who have properly backed up their data must determine how long it will take to recover the safeguarded copies. If they can’t recover quickly enough, it might be financially necessary to pay a ransom to avoid suffering downtime, which could have long-lasting reputational effects.

As we approach World Backup Day in 2021, CIOs should focus on prioritizing backup and recovery solutions that natively store all data in an immutable format, enabling them to recover quickly to avoid the poor economics of ransomware recovery. It’s clear that legacy solutions don’t meet the needs of businesses today, as companies that have prepared for an attack can still be left struggling to decide whether they should quickly pay a ransom or spend more time and money recovering from an attack and subsequent downtime.

Trends Involving IT, Offices and Sales in 2021 (March 27, 2021)

COVID-19 has rocked most sectors and individuals. As the nation holds its breath for the end of the pandemic, many industries are looking at the post-COVID-19 recovery landscape and trying to figure out what it means.

Many companies with substantial brick-and-mortar investments are seeing their square-footage use shrinking as more and more employees work from home offices or kitchen tables. Will this past year completely change work environments for years to come? Many thought leaders believe that the answer to that is yes.

For sales professionals, recovery is certainly a windfall, but it also presents challenges. The fact is, many sales pros found their customers closing up shop, and while the outlook is starting to become more positive, many customers may still be hesitant to purchase.

This post-pandemic uncertainty will cause a lot of stress as the new normal becomes more evident – so what do sales pros need to know as they maneuver out of isolation and hit the road again?

In this article, two respected professionals, Poly CTO David Bryan and Pipeliner CRM CEO Nikolaus Kimla, offer the following “post-pandemic” trends and predictions for the office-space and sales industries, respectively.

David Bryan, CTO of Poly:

Trend No. 1: Real estate costs are transitioning to IT investment in the post-pandemic world.

We’re hearing from our customers that as their real estate footprint is shrinking, those savings are being reinvested back into their technology so that people in disparate locations – some in an office, some at home – can communicate with one another. Offices in the future will primarily be used for group work such as meetings, team building and strategic planning, while independent work will likely be done at home. IT decision-makers are recognizing that every meeting room will need to be equipped with modern, AI-driven collaboration technology – as each will be needed to routinely connect to the much larger percentage of remote workers. This means flexibility will be an increasingly important attribute that managers will come to expect of their office spaces, and something they will look to IT teams to facilitate.

Trend No. 2: AI playing an essential role in office buildings.

We will see highly imaginative applications of AI, especially when it comes to touchless control. We are working in the lab on new experiences, powered by voice control, gesture control, and more advanced technologies. In this day and age, no one wants to be touching surfaces in the lobby, conference room or shared workspaces. AI will increasingly alert IT and facilities management to important information we can take action on—such as a meeting room that is exceeding capacity or notification that a meeting is over so the room can be cleaned. We are moving to a world where advanced room analytics and insights are not just about best practices for managing space but are also about being responsible organizations and keeping our employees healthy.

Trend No. 3: High-quality video conferencing technology becoming an equalizing force.

In 2021, intelligent control and high-quality audio, video and collaboration devices will be essential to “level the playing field.” Video helps remote participants get as close as possible to the experience of being in the same room, and as hybrid working becomes the norm, it will be essential to ensuring cohesive business operations and culture. True collaboration requires great systems tailored for those who are venturing into the office, as well as those collaborating from their homes. This requires more than just simple web cameras and is an area where industry-leading hardware makes all the difference. It is another area where AI and machine learning will be critical. The devices in the office need to be aware of the room, the participants and the meeting flow to deliver that “in-the-room” feel to the remote participants. AI can also help optimize things for those at home and ensure home users can project a professional image. For example, our world-class audio and features such as advanced NoiseBlockAI can help mask background noise, wherever you are.

Trend No. 4: Accelerated digital transformation and adoption of personal collaboration devices.

Now that the remote working model has proven itself to be more useful than was previously accepted, expect to see more products designed for personal use, along with accelerated adoption of cloud-based management. With hybrid working here to stay, individuals will seek flexibility in devices that offer comfort, quality, mobility and style, whether they choose to work from home, the office or another co-working space. Inevitably, enterprises will look to embrace device management of these products for on-premises as well as remote workforces. It’s about more than just sending someone off with a couple of devices picked up from the local store. It’s about ensuring employees at home have the same professional, high-quality and easily managed systems and experience they have come to expect in the office.

Nikolaus Kimla, CEO of Pipeliner CRM:

Prediction No. 1: Digital transformation and automation will be critical in the post-pandemic world to help sales pros facilitate more creative customer interactions.

In the post-pandemic world, sales teams need to take a more personal and empathetic approach to selling to re-establish relationships after long periods of isolation. This is only possible if organizations can recast their business models by digitizing repetitive processes that will free up sales teams to focus on more high-value activities that can improve customer relations.

Automation, which has been instrumental in driving a data-driven approach to sales processes, enables the digitization of administrative tasks and helps sales teams procure more customer intelligence to develop strategies for approaching customers. This is especially crucial if the customer has been through a challenging year and may be hesitant to re-engage.

Now is the time for organizations to develop digital processes that define success as well as encourage experimentation and risk-taking.

Prediction No. 2: The education of future sales pros must include a combination of management and technical skills.

The fact is, business schools have not kept pace with the sales industry. As a result, there has been a dearth of sales talent, along with extremely negative attitudes within academic institutions about teaching and learning sales.

As we enter post-pandemic recovery, academic institutions need to develop a new method of teaching sales, and a first step is to partner with corporate entities to develop work-ready talent. Additionally, institutions must adopt a digital strategy and expose students to the latest sales technologies, such as CRM, as well as industry automation solutions. Students should also be taught social media marketing tactics for prospecting new customers and developing buyer dossiers.

Another key approach is to train future sales leaders in soft skills; a Carnegie Mellon study found that 75% of success in business is tied to soft skills, which enable more empathy and understanding with customers. This tactic also includes “adaptive learning”—enabling students to understand what characteristics, skill sets, behaviors and competencies are required to adapt to different personality types.

Prediction No. 3: The wrong analytics can lead to devastating results. In the post-pandemic world, deeper and more accurate analytics can provide more impact and better decision-making.

The COVID-19 pandemic showed us what happens when data is interpreted incorrectly. In the past year, there were several instances when statistics were wrongly analyzed or facts incorrectly publicized, which can spell panic and confusion.

In business today, sales teams require deep analytics, with accurate data and actionable insights that can lead to more efficient and effective decisions. When data is incorrect, wrong decisions are made. Therefore, automated analytics tools are necessary to interpret data, allowing sales teams to implement decisive sales strategies that will prove successful.

Analytics tools should include the four primary types of analytics: descriptive, diagnostic, predictive and prescriptive. These tools must interconnect so that when data is examined, the results are meaningful. The goal of data is to remove problems, reduce risk and leverage opportunities. When sales pros can fully understand indicators and metrics, and deeply view actual issues, they can be proactive instead of reactive.

What Execs Have Learned about Migrating Legacy Systems to Cloud (March 19, 2021)

According to ResearchAndMarkets.com, the global application transformation market is expected to grow from $9.7 billion in 2019 to $16.8 billion by 2024. According to the analytics firm, the market is being driven by the need companies have for “a robust and agile environment to increase scalability and efficiency in the existing business landscape, (the) high maintenance cost of existing legacy applications and leveraging emerging technologies and increasing efficiencies of existing applications.”

The ResearchAndMarkets.com report aligns with the findings of Lemongrass’s 2021 Legacy-to-Cloud study, which was released March 18 and includes input from more than 150 IT leaders, the majority of whom work in companies with at least $100 million in annual sales.

Following are the study’s highlights. The complete Lemongrass 2021 Legacy-to-Cloud report can be found here.

Data Point No. 1: Data access, security, cost savings motivate cloud migrations.

A combined 77% of IT leaders responding to Lemongrass’s survey said their primary motivation for migrating legacy systems to cloud infrastructure was either a desire to secure data, maintain data access or save money. Optimizing storage resources and accelerating digital transformation were other top reasons given. Meanwhile, 78% of respondents said that IT management systems were the most likely legacy applications to move to the cloud, while 46% said security and 39% said e-commerce.

Data Point No. 2: Security, people, process, cost complicate cloud migrations.

Considering the importance of data security, it’s no surprise that security and compliance were listed by 59% of IT leaders as the top challenge facing enterprises when moving legacy systems to the cloud. Meanwhile, 43% of respondents said migrations took too long, 38% said costs were too high, and 33% said a lack of in-house skills was the top complicating factor.

Regarding cost, 69% of respondents said the typical legacy-to-cloud migration cost between $100,000 and $250,000, and 57% said these projects somewhat or very rarely come in under budget. In terms of job skills, database integration experience was cited by 21% of respondents as the top skill required for performing legacy migrations, followed by experience with the chosen cloud platform (15%), previous migration experience (12%) and testing validation (also 12%). Sixty-eight percent of respondents said it was very or somewhat hard to find people with these skills, and 40% said migrations took at least seven months to complete.

Data Point No. 3: Employee training, data security are the top operating challenges facing enterprises once legacy systems have been migrated to cloud infrastructure.

Forty-two percent of survey respondents said that difficulty training end users was the top challenge to using legacy systems now running on cloud infrastructure. Forty percent said the top challenge was that security concerns had not been adequately addressed, 34% said the cloud platforms they had chosen did not work as expected, and another 34% said the top challenge was a lack of in-house skills. Meanwhile, 60% said that multi-cloud management skills were the most important job skill for IT professionals running legacy systems in the cloud. Half the respondents said the top skill was database management, and 48% said programming. Seventy-one percent of respondents said it was hard or somewhat hard to find these people.

Data Point No. 4: Lessons learned from migrating and running legacy systems in the cloud

The top three lessons learned when migrating legacy systems to the cloud were:

1) allow for sufficient time (54%);
2) dedicate sufficient financial and people resources (52%); and
3) ensure that you have the correct people/skills in-house (52%).

The top three lessons learned when running legacy systems on cloud infrastructure were:

1) allow for sufficient time to manage the application (53%);
2) ensure that you have the correct people/skills in-house (52%); and
3) ensure that you achieve the outlined business goals (46%).

“The survey findings are very consistent with feedback we receive from our customers,” Vince Lubsey, CTO at Lemongrass, said in a media advisory. “Enterprises are anxious to reap the benefits of moving legacy systems to the cloud. They understand there are challenges but the benefits far outweigh the obstacles. The key to success is following best practices, proper training and time management. It also helps to have the guidance of an experienced partner to create the required cloud operating model.”

If you have a suggestion for an eWEEK Data Points article, email cpreimesberger@eweek.com.

IT Science Case Study: How to Reach a High Level of Observability (March 9, 2021)

Here is the latest article in an eWEEK feature series called IT Science, in which we look at what actually happens at the intersection of new-gen IT and legacy systems. This one’s about how to reach observability at a high level.

Unless it’s brand new and right off the assembly line, the servers, storage and networking inside every IT system can be considered “legacy.” This is because the iteration of hardware and software products is speeding up all the time. It’s not unusual for an app maker, for example, to update an application, or patch it for security purposes, a few times a month, or even a week. Some apps are updated daily! Hardware moves a little more slowly, but manufacturing cycles are also speeding up.

These articles describe new-gen industry solutions. The idea is to look at real-world examples of how next-gen IT products and services are making a difference in production each day. Most of them are success stories, but there will also be others about projects that blew up. We’ll have IT integrators, system consultants, analysts and other experts helping us with these as needed.

Today’s Topic: Committing to data-driven engineering and reaching a high level of observability

Name the problem to be solved: Armis is a leading agentless device security platform, and its systems generate massive amounts of data. With no unified logging solution, whenever the engineering teams had to troubleshoot an issue, they needed direct server access for each system.

Not only did this create permissions issues, it also failed to provide a comprehensive understanding of the data across multiple systems. Additionally, the large amounts of data and the natural unpredictability of data flows often resulted in exceeded quotas and billing overages.

What the team really needed was a self-serve solution that would allow developers access to all of the relevant system logs for real-time monitoring and alerting at scale, with integrations to their workflow and management tooling.

Describe the strategy that went into finding the solution:

When Roi joined Armis as the new head of DevOps, he immediately saw the need to bring in a solution that would allow the team to optimize its workflows and scale its coverage. He had been a happy Coralogix customer at his previous company and worked to implement the platform at Armis as well.

List the key components in the solution:

The solution enabled data-driven engineering with features such as data prioritization, filtering and normalization. Prioritization means only critical logs are sent to hot storage, while the rest are monitored in real time using the Coralogix Streama service and then directed to an S3 bucket. Normalization is also an important component, because the data sources span dev, testing and production; it standardizes log templates so that fields are unified across logs written by different developers in different systems.
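
Coralogix's parsing rules do this work in Armis's deployment, but the idea behind normalization is simple to show in a hedged sketch: map differently named fields from different services onto one template so queries behave the same everywhere. The field names below are invented for illustration, not Coralogix's actual rules.

    # Log-field normalization, illustrated with invented field names.
    FIELD_ALIASES = {
        "sev": "severity",
        "level": "severity",
        "msg": "message",
        "ts": "timestamp",
    }

    def normalize(record: dict) -> dict:
        return {FIELD_ALIASES.get(key, key): value for key, value in record.items()}

    dev_log = {"ts": "2021-03-09T21:25:52Z", "level": "error", "msg": "quota exceeded"}
    prod_log = {"timestamp": "2021-03-09T21:25:53Z", "sev": "error", "message": "retrying"}

    # Both records now expose the same "severity" field for downstream queries.
    assert normalize(dev_log)["severity"] == normalize(prod_log)["severity"]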

The team at Armis also saw immediate value in Coralogix’s Live Tail feature, which gives a centralized view of all system logs in real time, as well as the Coralogix CLI, which allows the developers to access logs in the dev stage without using the browser.

Dynamic alerting and error ratio alerts are the cherry on top, along with version tagging and additional integrations to CI/CD tools, which help to accelerate version delivery and time to market while improving stability and quality.

Describe how the deployment went, how long it took, and if it came off as planned:

The initial integration took just a few hours, and then the Coralogix Support Team helped to get the setup completed for data input, parsing, dashboards and basically anything else needed.

Unlike traditional tools, the platform is very intuitive and requires very little training to use, which has led to widespread adoption across the entire engineering department.

Describe the result, new efficiencies gained, and what was learned from the project:

As a result, the team has achieved a high level of observability and is able to provide customers with a smooth and highly performant experience.

Some of the most impressive improvements included standardizing all log data, including legacy logs; cutting data usage in half, for more economical monitoring and better coverage unrestricted by cost; and improving the CI/CD pipeline, from deploy time and automation tests to quick impact analysis of new versions.

Today, the team has implemented a data-oriented development environment across the organization.

Describe ROI, carbon footprint savings, and staff time savings:

At the same time that the team is getting more impactful insights from the data, it has also saved almost $200K per year on storage costs, thanks to the data prioritization done by the Coralogix platform. The team is using Coralogix to attain broad coverage of areas that weren’t monitored at all in the past and to successfully serve some of the largest companies in the world with a top-level SLA.

If you have a suggestion for an eWEEK IT Science article, email cpreimesberger@eweek.com.

Tech Women ‘Choose to Challenge,’ Call Out Gender Bias (March 8, 2021)

[Editor’s note: Today, March 8, is International Women’s Day. eWEEK is publishing a couple of relevant features involving women in the IT business. Here is the second one.]

International Women’s Day, which this year falls on Monday, March 8, calls on everyone to “choose to challenge”–which means to call out gender bias and inequality and to seek out and celebrate women’s achievements. Bias and inequality have inhibited women in their chosen careers through many generations of male-dominated business dealings.

To mark the annual day of recognition and reflection, Pluralsight, the online tech skills platform, asked its female course authors to share their thoughts on women in tech, their personal successes and learning opportunities, and their advice to women entering the tech industry.

Alice Meredith

Bio: https://www.pluralsight.com/authors/alice-meredith
Expertise: Leadership
Location: Salt Lake City, UT

Why does tech need women and girls? Why do you think it’s important for more women to join the tech industry?

Creating cultures that are welcoming and inviting for women within tech companies is no longer something to dabble in; it is a requisite for success. Tech companies must embrace the diversity of thought and innovation that women render. This ensures the products and services they provide are appealing to the largest single economic force in the world: American women, who control over 70% of all household spending in the U.S. Tech companies are doing much better at recognizing the value of and recruiting this previously untapped resource, yet they have been slow to create cultures that welcome and embrace gender diversity.

What was your biggest success and biggest learning opportunity in tech?

As a culture strategist, my greatest success in supporting tech teams stems from my ability to help leaders identify and rectify non-inclusive culture gaps.

My greatest learning opportunity came when I realized that my fear of not feeling comfortable working in the tech industry was unfounded. A recent study showed that one of the top three reasons young women aren’t choosing tech careers is that they feel they wouldn’t be comfortable working in a tech environment. I too had this fear, but I easily found my place and was able to make a difference.

What advice would you give to a woman considering a career in the tech industry? What do you wish you had known?

My experience in tech flows from “supporting tech” teams rather than being “in tech”; this by itself is a great example of the many opportunities available for women, with or without technical backgrounds, to flourish within a tech environment. My advice to women considering a career in the tech industry is to stop considering it–go for it. Stop thinking you don’t have as much knowledge or experience as someone with a grander tech-focused degree or background. Tech is advancing so rapidly that everyone is learning together. Your voice, your diversity of thought, and the talent you bring to the table are enough, and you are needed.

Kate Gregory

Bio: https://www.pluralsight.com/authors/kate-gregory
Expertise: Software development (C++)
Location: Ontario, Canada

What role can male team members play to best support women peers in the tech world?

A good ally supports a team member, especially when they are getting unfair treatment from elsewhere. So if someone keeps interrupting me in a meeting, an ally will say “I don’t think Kate had finished her point” or “Kate is still explaining X to us”. If I make a suggestion, and then someone else suggests it as though it were his own, an ally might say “I think it’s great you agree with Kate’s suggestion, so do I.” One thing a lot of allies want to do is ask me what they should do – this can be tiring, and I don’t always want to speak for all women, so doing the research themselves is another way to take some of the weight off me. They also need to examine themselves to see if they’re making assumptions about the women they work with, even if they are then acting in a nice and supportive way based on those assumptions. For example, assuming the women are the ones with childcare obligations and then adjusting schedules. Not all the women will have childcare obligations and they aren’t the only ones who will.

Also, putting people forward for things is good for everyone. If you get an opportunity (e.g., being on a panel) you can’t take, why not suggest a woman for it? I know several men who won’t be on all-male panels and suggest women to take their place and make a more diverse panel. If someone you work with somehow never seems to be suggested for the task force or the special committee or whatever, try suggesting them. (Just don’t exhaust your only woman in the company by making her be on everything, because all the committees want to have a woman on them.)

Also, since many women undervalue their time, telling someone a good price to charge or a good salary to expect can be super valuable. If you’re bringing in a trainer, consultant or speaker and the number they quote for the gig is half what it should be, tell them that. Or suggest a number first. Giving someone a good idea of what they can actually charge will bear fruit for many years to come.

What do you think is the best part of being a woman in the tech industry?

For better or worse, being part of an under-represented group in your industry makes you memorable. I don’t always want to be the only person in the group who anyone remembers later, but I have learned to make the best of it. I tend to stand out in crowds, and that can be an advantage. I just mostly like being in tech because it’s what I’m good at and it’s what I like to spend my time doing.

What advice would you give to a woman considering a career in the tech industry? What do you wish you had known?

The industry today is not at all like the one I joined in the late 70s. It’s actually more sexist and more difficult for many women. I would encourage anyone just starting in tech to find a peer support group, people who face the same issues, whether that’s being a woman in tech, being a person of colour in tech, being a late-career-changer in tech, or anything else. There is a warmth and acceptance in a good peer group that can carry you through a lot – and there’s often some highly practical advice, too. I tell young women today that if you find a horrible coworker or a horrible employment environment, that’s not about you, it’s about them, and better workplaces and coworkers exist. Don’t let the bad ones push you out. Find a place with less bad ones. I know that’s hard work, but at least it’s possible. You don’t need to quit tech because of a horrible workplace. They are the ones who are not good enough; you’re terrific.

Julie Lerman

Bio: https://www.pluralsight.com/authors/julie-lerman
Expertise: Software development (.NET, Docker, MSFT apps)
Location: Vermont, U.S.

How did you get started in tech? What experiences led you to tech as a career?

My first post-college job in NYC landed me at a magazine publisher. I eventually moved to a role working for the head of accounting, who had the only computer in a company of 1,000. Within a few days of starting that role, his computer was on my desk and I was figuring out how to make it perform our difficult tasks. I am going to totally out myself here and confess that this was in 1984. As more people in the company started getting computers, they kept coming to me for help since I wasn’t afraid to experiment. A few jobs later, someone had left behind a dog-eared copy of a dBase III programming book and I used that to teach myself how to automate some of the drudgery that my job required. This wasn’t exactly my first time coding, though. I had taken the only programming course offered in my college—BASIC—taught on some HeathKits built by our math professors. It was an all women’s college and one of those profs was a woman (nod to Carol Shilepsky!). So I never thought twice about whether or not, as a woman, I had a place in front of computers. Therefore I had some concept of what that dBase III book was leading me through.

What biases have you encountered along your journey and how have you combatted them?

Early on, when I was at the FoxPro user group meetings in NYC (late 1980s), I was one of four women in the group. The other three were older and highly respected in the community. I was in my mid-20s. The only time anyone near my age would talk to me was in order to ask me out on a date.

As a young woman, I definitely was taken for granted at a few jobs. In two of these, when I gave notice after finding more interesting work, the response was the same: A man in upper management who had always treated me like a wunderkind begged me to stay and suddenly offered me roles (and pay) that were relevant to the work I was already doing and could easily have kept doing. This was always a case of too little too late, and I told them thanks but no thanks.

And in both cases, they turned on me with threats. This came first from a guy I had thought was like a father figure, and second from a man who was a former Navy SEAL. That was a scary situation. I was 24 in one case and 26 in the other, certainly not used to having to defend myself. But I somehow held my cool and braved the tirades, and was happy to see what a good idea it was for me to leave. In my 30 years in tech, those long-ago events are still the most memorable. Now that I’m much older and have a lot of street cred, I don’t generally feel that I’m facing bias, although maybe I’m just oblivious in my rose-colored glasses. If anything, I do worry about facing age bias now. But I stopped caring about it when it comes to my own work.

What advice would you give to a woman considering a career in the tech industry? What do you wish you had known?

I had an amazing role model growing up (my mother) and was raised with the belief that I could do whatever I wanted. It never occurred to me, even when I was consistently one of the only women in the room, that I didn’t belong in tech. I want to share this attitude with anyone who is typically told or shown that they don’t belong in tech. There are many welcoming communities and businesses with a healthy, diverse environment where you are seen and heard, and where you are given the opportunity to learn and grow. Lean on your instinct to help find that place, and don’t accept the doubts that someone attempts to push into your head.

Cecilia Lejeune

Bio: https://www.pluralsight.com/authors/cecilia-lejeune
Expertise: Product Design, Project Management, VR
Location: Paris, France

Why do you think there are fewer women in technology than in other sectors? 

From the discussions I’ve had with young women looking for a career in tech, those who pursue it are passionate about tech because they were exposed to video games or electronics during their youth. Most of the time, a young girl’s desire to work in tech doesn’t come from a fantasized idea; it comes from a concrete and memorable experience. The key to bringing more women into the sector, for me, is to mind how we communicate about the daily life of a woman working in tech, and how this type of career is perceived by the public. The goal is to show young girls that this is an option for them.

How much do you think the tech world has changed for women since you started working in tech?

I started working in tech seven years ago. And oddly enough, I have felt more awkward in my work relationships in the past couple of years than when I started. Why? Because of the strong movements around #metoo, and because many behaviors from men toward women that were accepted before now seem out of place or questionable. This has added layers of complexity and some distance to the relationships I have with colleagues and direct management, for example. Where I was once able to blend into a team of men easily, today it would demand a lot of effort. I do hope that future generations will understand the importance of developing trust and open dialogue in their work relationships.

What advice would you give to a woman considering a career in the tech industry? What do you wish you had known?

Personally, I wish I had known that, salary-wise, it’s not okay to be paid less than a male coworker who does the exact same job. I was too shy to speak out in my case, but it’s your entire life that is at stake! So kindly and respectfully point out the difference to your manager and ask the reason for it. HR departments are very aware of these issues today, so get their help in the negotiation if you need to.

The post Tech Women ‘Choose to Challenge,’ Call Out Gender Bias appeared first on eWEEK.

]]>
Female Coders Explain the Rewards of App Creation https://www.eweek.com/news/female-coders-explain-the-rewards-of-app-creation/ Mon, 08 Mar 2021 16:03:40 +0000 https://www.eweek.com/?p=218491 [Editor’s note: Today, March 8, is International Women’s Day. eWEEK is publishing a couple of relevant features involving women in the IT business. Here is the first one.] Written by Lori Lorusso, JFrog It all starts with support, whether it was tinkering with a computer at age 4, tunneling with free AOL discs to avoid […]

The post Female Coders Explain the Rewards of App Creation appeared first on eWEEK.

]]>
[Editor’s note: Today, March 8, is International Women’s Day. eWEEK is publishing a couple of relevant features involving women in the IT business. Here is the first one.]

Written by Lori Lorusso, JFrog

It all starts with support, whether it was tinkering with a computer at age 4, tunneling with free AOL discs to avoid the “Net Nanny,” or changing majors from electrical engineering to computer science. All of the women in this report have families that fostered their curiosity and education and championed the talents that brought them to where they are today: Developer Advocates at Sunnyvale, Calif.-based JFrog. JFrog provides software developers with a binary repository management solution.

If there is one thing certain about the featured “Frogs” in this article, it is that they share a unifying characteristic: a passion for code. Each has demonstrated a commitment to tinkering with, writing, hacking on and sharing code with the tech community. Each has been present at various conferences and meetups during the last year, and there has not been one instance where they let the community down; no egos, no questions left unattended, just all tech all the time.

Three women, two generations, one passion: Code

Starting with Melissa McKay, we get a picture of how tech has transformed for Generation X. Generation Xers grew up when having a home computer was a luxury, high schools taught typing, and Brother or IBM word processors replaced typewriters. This was the ’70s and into the early ’90s.

“There are other ways of counting … besides base 10 … I was like, what?!” she said. This “aha” moment that Melissa recalled from her first days in college set the stage for more discovery and exploration, and it led to the growth of a math nerd into a full-fledged computer geek. The smile on her face and the laughter and asides that accompanied the story of her introduction to coding accurately reflect the persona she brings when she presents at conferences. The spark she felt that day is present whenever she shares technical content with the community in conference talks and blog posts, and it has helped propel her to become a Java Champion.

“I didn’t even know coding was a thing before college,” she said. Melissa received a computer as a high school graduation gift from her grandfather. She started school with the intent of getting an electrical engineering degree, but after an Intro to Programming class and learning C++, she changed her major to computer science.

Next-gen developers

The next or current generation of developers grew up with computers not just in their homes but with their own personal devices. Kat and Batel represent this segment of the tech population. 

Kat Cosgrove grew up with computers. Her dad was an engineer; they had a computer room when she was a kid, and by the time she was in middle school she was given her first personal computer. This brings us to Net Nanny and Kat’s desire to play internet role-playing games that were blocked by her parental-control software. A few free AOL internet discs later, her dad discovered that she had bypassed the Net Nanny and was continuing to play IRC games.

“That was my first foray into evading authority with technology, and now I get to do that semi-professionally. That was my first exposure to … I hesitate to call it hacking, but it was definitely subversive at the time,” she said. When her dad found out, he congratulated her for finding a workaround for the system he had tried to put in place. This led her to creating Geocities sites; MySpace then taught her CSS (thanks to code appearing in comment boxes), and she began teaching others how to do the same. Her desire to figure things out and then share that knowledge with her peers hasn’t stopped. She continues to teach others in her DevOps 101 workshops and webinar series.

Batel Zohar is JFrog’s newest Developer Advocate, but she has been with JFrog the longest and knows the most about the company’s DevOps Platform. Just like Kat, Batel grew up with computers, and her inquiring mind also had her tinkering on the hacker edge.

“I remember that I was a girl who loved to hack and play with computers when I was a kid,” she said. Her grandmother gave her a computer when she was 4, and by 12 her cousin, a computer technician, had her formatting computers and installing games and software. On her own, she was trying to break stuff with injection attacks.

All three developers channeled their desire to figure out how things break, and how to make them work, into successful careers as developers.

Developing careers as developers

Melissa went to college, took a 10-year break to raise a family, taught herself web design and web application development during that “time off,” and then went back and finished her degree. She has had several roles throughout her career: developer, engineer, team lead and then full-time developer advocate at JFrog.

Kat dropped out of college, also taught herself web design, and then went to code boot camp, taught at that boot camp after graduating, joined the IoT team at JFrog and was recruited after a year by the DevRel team to become a developer advocate.

Batel was a developer in the Israeli Army, working in multiple coding languages. She was a web developer for a short time, then joined JFrog as developer support engineer, moving up to enterprise solution lead and now developer advocate.

In addition to having a passion for tech, all three share a passion for teaching and helping others solve problems. Throughout their development as engineers, they asked questions, took risks by challenging themselves to keep learning rather than grow complacent, and are now sharing that knowledge with others in the community.

Empowering others to succeed

When asked how to become successful or what to do as a developer who wants to progress in their career, here is the advice they shared:

Melissa: “Everything that you learn, find someone to teach. Even if you’re a junior developer and you just learned something new, and it’s new to you and you think everyone else already knows it, find a way to teach it to your team. That will help you get more confidence with presenting, public speaking and organizing your thoughts. If you do that enough, you are going to get noticed. You will be that person who people go to when they have questions, you’ll become the expert on whatever area you choose; I think that is a good way to advance yourself.”

Kat: “It’s important for you to get used to asking questions, to get comfortable with it, and to understand that asking questions, no matter how simple, does not make you look stupid and does not make you incompetent. It is your senior engineers’ and your team lead’s literal job to answer those questions. They are responsible for your development in your career as long as you’re at that company and as long as you are on their team. Make them do their jobs. That’s a free resource. Don’t sit there and struggle, banging on your keyboard for three hours; just ask somebody. Knowing when to ask a question is a skill, and it makes you look smart, not stupid.”

Batel: “You really need to not be shy, and a big part of that is asking questions. You may start as a junior support engineer and be asked tons of questions you don’t know the answer to; that’s totally fine, because you have resources within your team and the community that can help you. Find a community; it could be within your own team or outside of your organization altogether. Once you learn something new, don’t be afraid to share it with the community, because the more you explain things, the better you will understand them.”

You can watch Melissa, Kat and Batel in action at one of JFrog’s upcoming workshops, and make sure you ask questions!

Lori Lorusso is on JFrog’s Developer Relations team as the Community Meetup Manager.

The post Female Coders Explain the Rewards of App Creation appeared first on eWEEK.

]]>
Samsung Puts Intelligence into High-Bandwidth Memory https://www.eweek.com/pc-hardware/samsung-puts-intelligence-into-high-bandwidth-memory/ Fri, 05 Mar 2021 23:00:23 +0000 https://www.eweek.com/?p=218483 Samsung, more widely known for making television monitors, smartphones and other popular consumer devices, also is a world leader in producing computer memory. The South Korean IT giant announced that it has developed the industry’s first high-bandwidth memory (HBM) chip that’s integrated with artificial intelligence processing power—the HBM-PIM. Like Intel, AMD, NVIDIA and others are […]

The post Samsung Puts Intelligence into High-Bandwidth Memory appeared first on eWEEK.

]]>
Samsung, more widely known for making television monitors, smartphones and other popular consumer devices, also is a world leader in producing computer memory. The South Korean IT giant announced that it has developed the industry’s first high-bandwidth memory (HBM) chip that’s integrated with artificial intelligence processing power—the HBM-PIM.

Just as Intel, AMD, NVIDIA and others are baking security, networking and other functionality into their processors, Samsung is doing the same, only with AI. The new processing-in-memory (PIM) architecture brings real-time AI computing capabilities inside high-performance memory to accelerate large-scale processing in data centers, high-performance computing (HPC) systems and AI-enabled mobile applications.

The pioneering HBM-PIM is the industry’s first programmable PIM solution tailored for diverse AI-driven workloads such as HPC, training and inference, Samsung said. The company plans to build upon this by further collaborating with AI solution providers for even more advanced PIM-powered applications, the company said.

The HBM-PIM design has demonstrated “impressive performance and power gains on important classes of AI applications,” Rick Stevens of Argonne National Laboratory said in a media advisory.

Most of today’s computing systems are based on the von Neumann architecture, which uses separate processor and memory units to carry out millions of intricate data processing tasks. This sequential processing approach requires data to constantly move back and forth, resulting in a system-slowing bottleneck especially when handling ever-increasing volumes of data.

Instead, the HBM-PIM brings processing power directly to where the data is stored by placing a DRAM-optimized AI engine inside each memory bank — a storage sub-unit — enabling parallel processing and minimizing data movement. When applied to Samsung‘s existing HBM2 Aquabolt solution, the new architecture is able to deliver more than twice the system performance while reducing energy consumption by more than 70%, the company claimed. The HBM-PIM also does not require any hardware or software changes, allowing faster integration into existing systems, Samsung said.
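
To picture that bank-level parallelism in software terms, consider the toy Python model below. This is purely a conceptual sketch of near-data processing, not Samsung’s actual programming interface; the bank count and array sizes are invented for illustration.

    import numpy as np

    # Toy model: 16 "memory banks," each holding a slice of the working set.
    rng = np.random.default_rng(0)
    banks = [rng.standard_normal(1_000_000).astype(np.float32) for _ in range(16)]

    # Von Neumann style: ship all the data to a central processor, then reduce.
    gathered = np.concatenate(banks)        # models the costly bulk data movement
    total_central = float(gathered.sum())

    # PIM style: each bank reduces its own slice; only 16 scalars ever move.
    partials = [float(bank.sum()) for bank in banks]   # models per-bank engines
    total_pim = sum(partials)

    # Same answer either way; the PIM path moves far less data.
    print(np.isclose(total_central, total_pim, rtol=0.0, atol=1.0))

The point of the model is only that the result is identical while the volume of data crossing the memory bus shrinks dramatically, which is where the claimed performance and energy gains come from.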

Samsung’s paper on the HBM-PIM was selected for presentation at the renowned International Solid-State Circuits Conference (ISSCC), held virtually this year, which ended Feb. 22. Samsung’s HBM-PIM is now being tested inside AI accelerators by leading AI solution partners, with all validations expected to be completed within the first half of this year, the company said.

The post Samsung Puts Intelligence into High-Bandwidth Memory appeared first on eWEEK.

]]>
Top ETL, Data Integration Tool Vendors https://www.eweek.com/news/top-etl-data-integration-tool-vendors/ Tue, 02 Mar 2021 20:40:13 +0000 https://www.eweek.com/?p=218445 All data is created in one place and moved–sometimes very often–from one storehouse to another. ETL (extract, transform and load) and data integration have always been among the thorniest problems in all of IT to solve efficiently. This software aids in copying or moving data from one database or repository to another and ensuring that […]

The post Top ETL, Data Integration Tool Vendors appeared first on eWEEK.

]]>
All data is created in one place and moved, sometimes very often, from one storehouse to another. ETL (extract, transform and load) and data integration have always been among the thorniest problems in all of IT to solve efficiently. This software aids in copying or moving data from one database or repository to another and ensuring that the data is formatted correctly for the task at hand. ETL enables putting data to work and maximizing its value.

Generally, the ETL process has worked well, and it has been improved over time. Today, ETL, along with its sibling extract, load, transform (ELT), is used within increasingly complex data frameworks, including edge computing, the internet of things, connected supply chains, cloud environments and others. As enterprises move to more advanced business intelligence and data analytics, including systems that rely on machine learning and artificial intelligence (AI), ETL functionality is crucial.

Deploying the right ETL or data integration tool is very important to the success of an IT system. It can speed data processing, provide new ways to link and use data, and trim costs and time related to manual data management processes.
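
To make the pattern concrete before turning to the vendors, here is a minimal sketch of the three ETL steps in plain Python, using only the standard library’s sqlite3 module; the table names, columns and sample rows are invented solely for illustration.

    import sqlite3

    # Seed a toy source database so the sketch runs end to end.
    src = sqlite3.connect(":memory:")
    src.execute("CREATE TABLE raw_orders (order_id INTEGER, amount_cents INTEGER, country TEXT)")
    src.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)",
                    [(1, 1999, "us"), (2, 550, "de"), (3, 1200, None)])

    # Extract: pull the raw rows from the operational store.
    rows = src.execute("SELECT order_id, amount_cents, country FROM raw_orders").fetchall()

    # Transform: convert cents to dollars, normalize country codes, drop bad rows.
    clean = [(oid, cents / 100.0, country.upper())
             for oid, cents, country in rows if country]

    # Load: write the cleaned rows into the target (warehouse) table.
    dst = sqlite3.connect(":memory:")
    dst.execute("CREATE TABLE orders (order_id INTEGER, amount_usd REAL, country TEXT)")
    dst.executemany("INSERT INTO orders VALUES (?, ?, ?)", clean)
    dst.commit()
    print(dst.execute("SELECT * FROM orders").fetchall())

Commercial tools wrap this same extract-transform-load flow in connectors, scheduling, monitoring and governance, which is where the vendors below differentiate themselves.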

Here are some of the top ETL/data integration vendors in terms of market share, along with a look at their product and how it fits into the overall ETL marketplace. eWEEK used several industry sources to assemble this list. These include: eWEEK reporting, Technology Advice, G2 Crowd, IT Central Station, Gartner Peer Insights, TrustRadius and Crunchbase.

Amazon Web Services (AWS)

Seattle, Wash.

Value Proposition for Buyers: AWS, which owns a whopping one-third of the global web services business, remains the undisputed heavyweight of cloud computing service providers. Thus it’s no surprise that it also offers a slate of products that connect legacy systems already using ETL to the cloud. These services include: AWS Import/Export Snowball, which offers petabyte-scale data transport; AWS Glue, a dedicated managed ETL service; AWS Database Migration Service, which is designed to move entire databases; and AWS Data Pipeline, which transports data across AWS computing environments along with on-premises systems. (A brief sketch of driving Glue from code follows the list below.)

Key values/differentiators:

  • AWS services offer numerous capabilities, robust data management consoles and features that line-of-business employees can use to build analytics capabilities from various AWS platforms and engines. These include Redshift, S3 (Simple Storage Service) and virtual private cloud (Amazon VPC), as well as legacy mainframes and other systems; the latter are served by tools from Alteryx, Informatica and Matillion. AWS also offers ELT, which pushes transformation into the database.
  • AWS offers a broad menu of tools designed to address data management challenges involving the cloud. These range from straightforward ETL to more sophisticated software that aids in moving massive amounts of data in an efficient and cost-effective way. AWS places a heavy focus on data integrity and data security.
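
As referenced above, here is a minimal sketch of kicking off a managed Glue ETL job with the boto3 SDK. The region, job name and argument are hypothetical, and the job itself (script, IAM role, connections) would be defined separately in Glue.

    import boto3

    # Connect to the Glue service; the region and job name are placeholders.
    glue = boto3.client("glue", region_name="us-east-1")

    # Start a run of a job defined earlier in the Glue console or API.
    run = glue.start_job_run(
        JobName="nightly-orders-etl",
        Arguments={"--target_bucket": "s3://my-warehouse-bucket"},
    )

    # Poll the run; Glue reports states such as RUNNING, SUCCEEDED or FAILED.
    status = glue.get_job_run(JobName="nightly-orders-etl", RunId=run["JobRunId"])
    print(status["JobRun"]["JobRunState"])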

Informatica

Redwood City, Calif.

Value Proposition for Buyers: Informatica has long established itself as a leader in the data integration business, consistently ranking among the top vendors for data management and ETL. Its platform supports virtually all forms of data migration and transformation, including with AWS, Azure and other leading platforms and tools. It delivers a high level of automation and data validation across development, testing and production environments. The vendor has earned a Gartner Customers’ Choice award for a few years in succession.

Key values/differentiators:

  • Informatica supports multi-cloud, on-premises and hybrid data integration in real time and batch modes. In addition, Informatica supports all major data formats and structures through native connectors. This includes industry specific formats such as SWIFT, HL7 and EDI X12.
  • Informatica PowerCenter supports data management and integration across its lifecycle. It includes strong support for security and regulatory requirements. This includes non-relational data.
  • The platform also supports grid computing, distributed processing, high availability, dynamic partitioning, pushdown optimization and adaptive load balancing. This produces a highly scalable and stable environment.

Pentaho (Hitachi Vantara Corp.)

Santa Clara, Calif.

Value Proposition for Buyers: Pentaho Data Integration (PDI) is well-respected for strong and reliable functionality delivered in a no-code process, an approach the industry is rapidly adopting. The tool handles data ingestion, blending, cleansing and preparation within a visual drag-and-drop environment. Pentaho works with all data types and formats and includes a powerful metadata injection feature that manages enterprise data at scale. It also includes a large library of pre-built components and delivers powerful orchestration capabilities that aid in coordinating and combining data. (A sketch of scripting a PDI run appears after the list below.)

Key values/differentiators:

  • Pentaho addresses big data integration with a zero-coding approach. The platform aims to eliminate manual programming and scripting. It also allows users to switch between execution engines, such as Apache Spark and Pentaho, and it supports Hadoop distributions, Spark and objects stored in NoSQL. This allows the platform to perform real-time data ingestion and tap IoT protocols.
  • PDI includes pre-built templates and it supports spot checks while data is in-flight, which aids in validation. It also delivers powerful orchestration capabilities along with notifications and alerts, and it includes an enterprise scheduler that coordinates workflows. In addition, the application ingests nearly any relational database, open source database, and file format. It connects to major business applications such as Salesforce and Google Analytics.
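
As noted above, transformations built in PDI’s visual designer can also be run from scripts. A minimal sketch, assuming a Linux install of PDI and an existing transformation file (both paths are placeholders), invokes PDI’s Pan command-line runner from Python.

    import subprocess

    # Run a PDI transformation through Pan, PDI's command-line runner.
    result = subprocess.run(
        ["/opt/pentaho/data-integration/pan.sh",
         "-file=/etl/clean_orders.ktr",   # transformation built in the designer
         "-level=Basic"],                 # logging verbosity
        capture_output=True, text=True,
    )

    # Pan exits non-zero on failure, so surface the log for troubleshooting.
    if result.returncode != 0:
        raise RuntimeError("PDI transformation failed:\n" + result.stderr)
    print(result.stdout)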

Devart

Prague, Czech Republic

Value Proposition for Buyers: Devart’s ETL toolset, Skyvia, is a software-as-a-service data platform that also uses a no-code, wizard-based integration approach. It’s designed to be used without special knowledge of ETL or data integration. The graphical interface includes an intuitive set of wizards, templates and editors that pull data into a cloud, where data manipulation takes place. The platform provides strong mapping tools and features along with powerful automation for bi-directional synchronization.

Key values/differentiators:

  • Skyvia builds reports and dashboards from almost any format, including SQL, CSV, FTP, SFTP, SQL Azure, Amazon RDS, Amazon S3, Dropbox, Box, Stripe, Oracle, Magento, G Suite, Google Drive, Dynamics CRM and Salesforce, to name a few. It also provides strong data export features, including strong filtering, the ability to export related object data and export scheduling.
  • The bi-directional syncing capability means that all data handled by Skyvia is available for use in real time. What’s more, the platform preserves source data relationships in the target so that it can import data without creating duplicates. This makes it ideal for use among different groups within an enterprise.

Fivetran

Oakland, Calif.

Value Proposition for Buyers: Fivetran, one of the newer-generation data integrators, focuses on complete data replication within a no-code, zero-maintenance framework. It offers automated data connectors that work with virtually all major applications, database formats and file types. The vendor’s ELT approach includes strong security and regulatory compliance tools. The Fivetran platform connects various sources of data to a central data warehouse in order to provide a holistic view of an organization. (A sketch of triggering a sync through its API follows the list below.)

Key values/differentiators:

  • Fivetran delivers a straightforward and easy-to-use interface. The application’s use of a centralized data warehouse can simplify data management by automating processes and allowing enterprises to focus on BI and analytics tasks. Fivetran receives high marks for its willingness to work with customers and provide service and support.
  • Fivetran features a robust and extensive set of connectors for virtually every application or data format. These include platforms as diverse as Salesforce, Oracle, Zendesk, Shopify, HubSpot, Stripe, Xero, Marketo, Mailchimp, GitHub, Workday and FTP. The vendor supports quick and easy setup with maintenance-free data pipelines.
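
As referenced above, a minimal sketch of requesting an on-demand connector sync through Fivetran’s REST API might look like the following. The credentials and connector ID are placeholders, and the endpoint path should be checked against Fivetran’s current API documentation; treat it as an assumption here.

    import requests

    # Placeholders: a real key/secret pair and connector ID come from Fivetran.
    API_KEY, API_SECRET = "my-api-key", "my-api-secret"
    CONNECTOR_ID = "my_connector_id"

    # Fivetran's v1 API uses HTTP basic auth with the key/secret pair.
    resp = requests.post(
        f"https://api.fivetran.com/v1/connectors/{CONNECTOR_ID}/sync",
        auth=(API_KEY, API_SECRET),
        timeout=30,
    )
    resp.raise_for_status()
    print(resp.json())  # the response confirms the sync request was accepted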

Microsoft

Redmond, Wash.

Value Proposition for Buyers: Microsoft isn’t the world’s most well-known data integrator, but it has well-respected tools in this category, both for the cloud and for traditional data structures residing in SQL Server databases. Azure Data Factory is a hybrid data integration service that operates in a no-code environment. It extracts data from heterogeneous data sources and transforms them into cloud-scale repositories. The platform offers strong data mapping capabilities and includes tools for connecting the data to virtually any BI or analytics tool. SQL Server Integration Services (SSIS) uses a drag-and-drop interface and strong data transformation capabilities to import data and integrate it with numerous software tools and platforms, including Salesforce. (A sketch of starting a Data Factory pipeline from code follows the list below.)

Key values/differentiators:

  • Azure Data Factory extracts data from numerous data sources, including SSIS. It offers connectors to more than 80 external data sources (including AWS, Cassandra, DB2, and numerous Azure repositories). Data Factory accommodates both cloud and on-premises data while delivering enterprise-grade security. The platform supports both codeless UI and the ability to write custom code.
  • SSIS operates in a graphic environment and tackles enterprise-grade data extraction, data cleansing and data transformation tasks. It offers import/export wizards to simplify data movement, and it includes built-in scripting. The platform features a catalog database (SSISDB) that makes it easy to store, run and manage packages. In addition, SSIS can automate the maintenance of a SQL Server database. The platform received a Gartner Customers’ Choice 2018 award for Data Integration Tools.
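
As referenced above, a minimal sketch of starting a Data Factory pipeline run from Python uses the azure-identity and azure-mgmt-datafactory packages. The subscription, resource group, factory, pipeline name and parameter are all placeholders; the pipeline itself would be authored separately in Data Factory.

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient

    # Authenticate with whatever credential the environment provides
    # (CLI login, managed identity, environment variables and so on).
    credential = DefaultAzureCredential()
    adf = DataFactoryManagementClient(credential, "my-subscription-id")

    # Kick off a pipeline that was authored separately in Data Factory.
    run = adf.pipelines.create_run(
        resource_group_name="my-resource-group",
        factory_name="my-data-factory",
        pipeline_name="CopyOrdersToWarehouse",
        parameters={"targetContainer": "curated"},
    )
    print(run.run_id)  # poll adf.pipeline_runs.get(...) with this ID for status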

Oracle

Redwood City, Calif.

Value Proposition for Buyers: Oracle’s plethora of databases in numerous vertical segments positions the vendor as a natural choice for many organizations. Oracle Data Integrator (ODI) offers a graphical interface that enables users to build and manage data integration in the cloud. It is designed for larger enterprises with significant data migration needs. ODI supports a declarative design approach and includes automation tools. An ELT architecture eliminates the need for an ETL server, something that can simplify tasks and reduce costs.

Key values/differentiators:

  • Oracle ODI is designed to serve as a comprehensive data integration platform that addresses the gamut of an organization’s data management needs. It works with major databases such as IBM DB2, Teradata, Sybase, Netezza, and Exadata as well as open source Hadoop. ODI taps existing RDBMS capabilities to integrate with other Oracle products for processing and transforming data.
  • ODI is designed to reduce data movement in the cloud. It achieves this capability partly by tackling ELT and ETL directly where the data resides instead of making copies of data to remote locations. It also aims to eliminate hand coding through robust mapping capabilities.

SAP

Walldorf, Germany

Value Proposition for Buyers: SAP’s BusinessObjects Data Integrator handles large-scale data migrations, integrations and ETL. It takes aim at the challenges of moving large volumes of data between on-premises and legacy systems and the cloud. The software offers a graphical interface, powerful connectors and tools to support extreme extraction, transformation, and load (ETL) scalability. All of this delivers impressive flexibility and scalability through prebuilt data models, transformation logic, and data flows.

Key values/differentiators:

  • BusinessObjects Data Integrator is built into SAP’s Rapid Marts, which offer powerful ETL features optimized for reporting and end-user query and analysis. The platform can extract data from numerous enterprise systems, including SAP R/3, Siebel, Oracle, PeopleSoft, and J.D. Edwards applications.
  • Data Integrator Designer offers a single tool for performing all tasks related to building, testing, and managing an ETL job. This includes: managing projects; profiling data; creating ETL jobs; cleansing, validating, and auditing data; setting parallel job execution; building workflows; and testing, debugging, and monitoring ETL jobs.

SAS

Cary, N.C.

Value Proposition for Buyers: SAS has long been a major player in the world of BI and analytics, but it also offers products designed to handle virtually any data-related task. Its Data Integration Studio serves as a premier ETL product for linking data within SAS applications and beyond. The visual design tool can pull data from almost any source and, using powerful tools and logic, integrate it with analytics software. It delivers powerful and easy-to-use capabilities designed for multi-user environments.

Key values/differentiators:

  • SAS Data Integration Studio migrates, synchronizes, and replicates data among different operational systems and data sources. It alters, reformats, and consolidates data as required. Real-time data quality integration cleanses data as it is being moved, replicated, or synchronized. Users can build and apply reusable business rules.
  • DIS lets users query and use data across multiple systems without the physical movement of source data. SAS Data Integration provides virtual access to database structures, ERP applications, legacy files, text, XML, message queues, and many other sources. This allows users to join data across virtual data sources for real-time access and analysis. The resulting semantic business metadata layer reduces data complexity.

Talend

Los Altos, Calif.

Value Proposition for Buyers: Talend has a strong reputation among data management tool providers. The company offers three primary products aimed at ETL and related tasks: Talend Enterprise Data Integration, Talend Platform for Big Data Integration and Talend Open Studio for Data Integration. All three products landed on Gartner’s Customers’ Choice 2018 list. The vendor’s products have a reputation for speed and performance, flexibility and scalability, and ease of use.

Key values/differentiators:

  • Transforming, moving and synchronizing data across heterogeneous sources and targets is at the center of Talend’s product offerings. The vendor offers highly flexible tools that work with cloud services such as AWS, Azure and Google as well as enterprise apps like Salesforce, Dropbox and Box—using ETL, ELT, batch and real-time processing.
  • Talend offers a robust array of features within a graphical interface. This includes team collaboration features, continuous integration and delivery, visual mapping, data governance and security features, including fraud pattern detection and advanced matching and statistics analysis.
  • Users like the company’s clear vision, roadmap and communities that operate within an open source framework. They also praise its powerful capabilities, delivered at a lower cost than competitors.

Xplenty

San Francisco, Calif.

Value Proposition for Buyers: Xplenty provides a cloud data delivery platform that integrates numerous data stores, applications and other data sources. In most regions, the SaaS ETL platform can run on AWS, Google Cloud or the vendor’s own public or private cloud. The vendor is known for delivering a highly flexible, scalable and secure platform for managing nearly any type of data workload. It offers a broad set of APIs.

Key values/differentiators:

  • Users give the vendor high marks for ease of use, flexibility and features. They also praise Xplenty for its service and support.
  • The Xplenty platform uses native connectors to support more than 100 data stores and SaaS applications, including Facebook, Salesforce, AWS, Google Cloud, Microsoft Azure, Magento and Slack.
  • Xplenty uses a package designer to implement a broad array of data integration use cases. The graphical point-and-click interface allows users to manage data without coding. The platform executes packages directly from the user interface or from an API. This approach simplifies automation, scheduling, job monitoring, status reports and other orchestration information.

Go here to find other eWEEK Top Vendors articles.

The post Top ETL, Data Integration Tool Vendors appeared first on eWEEK.

]]>