In one of our previous posts, we highlighted the important role that making security a part of your organizational culture plays in keeping your remote workforce secure during the COVID-19 pandemic. But what does that entail? In this post, we’re going to flesh out the key steps that security teams and their leadership should take to make a strong culture of security a reality within their organizations.

1. Security culture is inseparable from the values of your organization’s leadership

Like any other organizational value, building a culture of security starts at the top. Invested stakeholders, usually starting with senior leadership, must cascade the types of cultural changes they wish to see by helping spearhead initiatives that will ultimately transform their organization. Although it is IT’s job to educate and engage with employees who break security policies and don’t follow security best practices, it would be very difficult for IT to function in an organization where leadership doesn’t embody the values needed to maintain a secure organization.

While security teams and leadership have historically talked past one another, there is a growing understanding that leadership must play a role in fostering a culture of security by investing in security teams and setting the expectation that security is taken seriously across the entirety of the organization. Luckily, a growing number of security teams have found a common language to discuss these issues with the board and C-level executives – the language of business risk assessment and security performance benchmarking. When security leaders and business leaders speak the same language, it’s then that business leaders will begin to understand their role in shaping their organization’s security posture. This will motivate them to enshrine security as one of the organization’s core values and enable processes like best practices documentation and security education programs to play a critical role in employee onboarding and training.

With this in mind, it might be challenging for organizations whose leaders don’t already appreciate the importance of security to adapt to the security challenges of remote work. Assuming these processes are in place within your organization, now is the time to update them to appropriately reflect the risks remote employees may encounter while working from home. However, if such processes are not in place, implementing them will obviously be a critical goal going forward.

2. Employees must be made aware of how important security is to the organization and how it impacts their work

Whether or not your organization has training and documentation in place, it’s a good idea to reiterate the significance of security best practices to employees through company-wide communications channels and remote events like security discussions and training. This is especially true given that many employees are adopting new technologies to work and collaborate remotely while facing new and emerging types of malware and social engineering. Your aim as you educate employees is to remind them that security is critical to the health of the organization and that the security risks they face translate directly into risks to their job performance. Ultimately, an employee affected by a security incident will be unable to perform their duties, making it very important for them to broadly grasp the types of cyber threats the organization faces.

3. As you educate employees, tie it into personal learning

A good security education program effectively serves a workforce development function. Getting employees to see this will improve employee buy-in and make them more readily embrace security education. In addition to the previous point of tying security education to organizational health and improved job performance, you should also highlight that security education will make employees good digital citizens, which will help them in their personal lives and in future roles. To reflect this mindset, security teams should, wherever applicable, highlight when security lessons apply both on the job and off the job.

4. Encourage employees to apply what they’ve learned

Building and revamping security education programs for the remote work era is only half the battle. Getting employees to apply what they’ve learned by identifying and potentially stopping incidents is the ultimate goal. Comprehensive security education programs should often be paired with periodic simulations (like phishing tests) where employees can demonstrate their security savvy. Employees and departments that are successful in identifying real or simulated incidents should be recognized for doing so during performance reviews and evaluations.

5. Build a security resource library

Most of this post has focused on the nature of security education and awareness programs; however, documentation is an important resource for employees as well. Good onboarding documentation, like your employee handbook, is critical to setting the expectation that security is important. However, your organization should more generally provide other documentation. In most cases, this will take the form of a security resource library which should contain plain language summaries of company security policies, as well as descriptions of cyber risks relevant to your company. You might also choose to include learnings from previous security training in the form of videos or other interactive content. Finally, you’ll want to ensure you’ve assigned a stakeholder to maintain this library and encourage employees to review it periodically so that they can stay up to date on what they need to know to stay secure.

If you already have such a resource, it’ll naturally be a great channel to provide employees with the lessons they’ll need to stay safe while working remotely. If not, it’s not too late to build one. You might find that some of your existing security content can readily be turned into materials to give remote employees the security insights they’ll need as they navigate the security risks of remote work.


This article was originally published at nightfall.ai


Featured Image Credits: Pixabay

Remember tokenized securities or securitization with tokens on the blockchain?

With the entire year in crypto defined by a maelstrom of projects adding decentralized finance (DeFi) aspects to their products, it can be easy to forget that previous advancements in blockchain-based technologies have continued to make great headway in terms of adoption and application.

Security tokens and tokenized securities 

In 2019 especially, with greater regulatory scrutiny on blockchain-based crowdfunding in the shape of initial coin offerings (ICOs), many projects sought to reconcile crypto’s much-maligned aspect of democratic fundraising with increasingly unforgiving regulatory compliance. Hence the proliferation of Security Token Offerings (STOs) that meant to replace ICOs as legitimate, law-abiding instruments to raise funds and issue securities through blockchain-based tokens.

It’s important here to distinguish between security tokens and tokenized securities — often used interchangeably, but hardly the same thing. In the former, blockchain technology is used to create new tokens that are a representation of real-world “securities”, i.e. crypto assets that share some qualities with securities in the traditional sense. In the latter, we are talking about existing assets (securities) in the real world that are expressed digitally… wrapped, if you will, in a token technology.

An overlooked breakthrough

Put another way, security tokens create both a new token and a new security, while tokenized securities simply digitalize existing securities. That digitalization solves a major problem with traditional securities, which makes it somewhat surprising that the approach hasn’t been picked up more widely.

Tokenizing securities immediately helps widen the market and improve liquidity. In addition, because a tokenized security isn’t a new product, there is less for regulators to scrutinize – it is simply a new, digital channel for distribution, which actually makes tokenized securities simpler to approve.

They’re not just an idea, they’re already here.

Because tokenizing securities is comparatively simple to do, quite a number of them have already entered the market. Last year, we saw traditional funds such as 22X Fund put together a tokenized fund (with money raised through an ICO, in fact, in 2018) to invest in 22 startups. But SPiCE would argue it happened even earlier: the VC fund, set up in 2017, lays claim to being the first tokenized VC fund able to offer immediate liquidity for venture capital — which otherwise takes years to liquidate!

This year, AllianceBlock, which is building the “world’s first globally compliant decentralized capital market”, partnered with another blockchain firm, AIKON, for a secure blockchain-based identity management service — making decentralized finance services accessible to all, and securing that access with the blockchain.

The data already shows that the coming years will see securities fully digitized and empowered by blockchain. From owning a small share in your favorite soccer club to fractional ownership of pizza restaurants in a country halfway around the world from you, using blockchain for authentication is spelling out a way for $256 trillion worth of real-world assets, mostly illiquid as physical representations, to go digital.

As they say in blockchain, tokenized securities are a matter of when, not if.


This article was originally published at aikon.com


Featured Image Credits: Pixabay

There’s every indication that the pandemic is changing the nature of cybersecurity. Online threats are evolving to match our new remote-work paradigm, with 91% of businesses reporting an increase in cyber attacks during the coronavirus outbreak.

Hackers are getting more and more sophisticated and targeted in their attacks. Many of these cyber threats have been around for a while, but they are becoming harder for the average user to detect. Beware of these four common types of cyber threats – and learn what you can do to prevent them.

Advanced phishing attacks

Phishing takes place when a hacker tricks an individual into handing over information or exposing sensitive data using a link (with hidden malware) or a false email. These types of security threats are quite common, but in recent months they are becoming even more advanced.

Microsoft’s recent survey of business leaders in four countries found that phishing threats are currently the biggest risk to security. Since March, 90% of those polled said that phishing attacks have impacted their organization, and 28% admitted that attackers had successfully phished their users. Recently, phishing emails have targeted enterprises to capture personal data and financial information using one of the following tactics:

  • Posing as a provider of information about COVID-19 vaccines, PPE, and other health and sanitation supplies
  • Creating false “portals” for business owners to apply for government assistance and stimulus funds during the economic shutdown
  • Using download links for platforms and tools that help remote teams communicate, such as video conferencing
  • Posing as “critical update” downloads for enterprise collaboration solutions, such as Microsoft OneDrive, and social media applications
  • Targeting IT service providers that ask for payment in order to provide tech support.

Phishing is so effective because it can be very hard to recognize and because it targets individual people rather than IT vulnerabilities. Yet there are still ways to lower your risk of phishing.

How to prevent phishing: The best chance to prevent phishing attacks is to educate your teams on what to look for in a phishing message. Poor spelling and grammar, as well as an email address that doesn’t match the purported sender, are telling signs of a phishing message. If an offer seems too good to be true, it is a good sign you’re being scammed. In addition to user education, you can add multi-factor authentication and other interventions to stop phishing messages from getting through. “Spam filters with sandboxing and DNS filtering are also essential security layers because they keep malicious emails from entering the network, and protect the user if they fall for the phishing attempt and end up clicking on a malicious hyperlink,” one security expert told ZDNet.
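
To make the “mismatched sender” heuristic concrete, here is a minimal sketch of the kind of check a mail filter or awareness tool might run. It is illustrative only: the looks_like_phishing function and the small brand-to-domain table are hypothetical, not part of any product mentioned here.

TRUSTED_DOMAINS = {"microsoft": "microsoft.com", "paypal": "paypal.com"}  # illustrative entries

def looks_like_phishing(display_name, from_address):
    # Flag mail whose display name invokes a known brand but whose
    # sending domain does not belong to that brand
    domain = from_address.rsplit("@", 1)[-1].lower()
    for brand, real_domain in TRUSTED_DOMAINS.items():
        if brand in display_name.lower() and not domain.endswith(real_domain):
            return True
    return False

print(looks_like_phishing("PayPal Support", "support@paypa1-secure.net"))  # True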

Ransomware

Ransomware is a type of security threat that encrypts a victim’s files so they can’t access their information. The hacker then asks for a ransom – usually payment – to restore access and decrypt the user’s data.

Perhaps the most notorious recent example of a ransomware attack is that of Garmin. In July, Garmin – a navigation and fitness wearables company – was hit by a ransomware attack that downed service for virtually every Garmin customer.  “Hackers deployed the ransomware tool WastedLocker, which encrypts key data on a company’s digital infrastructure,” reported Cyber Security Hub. “In the case of Garmin, website functions, customer support, and user applications were all affected. Unlike typical ransomware software, WastedLocker does not steal identifying information and holds it for ransom. Instead, it renders programs useless until decrypted.” Garmin reportedly paid $10 million for the decryption key to resume services after four days of outages.

Garmin isn’t alone, however. There’s been a seven-fold increase in ransomware attacks this year targeting companies of all sizes. So, what can your organization do to protect itself?

How to prevent ransomware: First and foremost, it’s important to make sure your security protocols are kept airtight – and apply security patches as quickly as possible to prevent hackers from exploiting vulnerabilities. A tool like Nightfall can make it easier to maintain a strong defense, with AI monitoring your network for any issues. Multi-factor authentication can also prevent hackers from getting too far into your system. And, you should regularly back up your system so if a cyber ransomware attack does happen, you’ll be able to recover some data.

Password-based cyber attacks

A password-based cyberattack is one that targets users who have the same password for multiple sites. Research from the World Economic Forum found that 4 out of 5 global data breaches are caused by weak or stolen passwords.

There are several different ways a hacker can infiltrate your system using a password-based cyberattack. The most common method is known as a brute force attack. This attack uses a computer program to try to log in to a user’s account by trying all possible password combinations, starting with the most common and easiest-to-guess options – for instance, “1234” or “abcde”. Sensitive data like passwords, credentials and secrets are in constant danger of exposure, especially as more companies conduct the majority of their business in the cloud. The highly collaborative and always-on nature of cloud services makes it hard to enforce good password practices. Therefore, organizations need data loss prevention (DLP) to secure essential data from being exposed.
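
Some quick arithmetic shows why short, common passwords fall to brute force almost instantly while long, mixed-character passwords do not. The one-billion-guesses-per-second rate below is a round figure chosen purely for illustration, not a measured attacker speed.

GUESSES_PER_SECOND = 1e9  # illustrative offline attacker speed

keyspaces = {
    "4-digit PIN": 10 ** 4,
    "8 lowercase letters": 26 ** 8,
    "12 chars from the full 94-symbol keyboard": 94 ** 12,
}

for label, size in keyspaces.items():
    years = size / GUESSES_PER_SECOND / (60 * 60 * 24 * 365)
    print(f"{label}: {size:.3g} combinations, ~{years:.3g} years to exhaust")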

How to prevent a password-based attack: Make it easy for users and security teams alike to circumvent the risk of password attacks by implementing password-free authentication methods. This is a type of authentication that requires a user to confirm their identity during the login process through a separate channel. This extra step can also protect your workspace if an account is compromised or a device gets stolen.
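
One common building block for confirming identity through a separate channel is a time-based one-time password (TOTP) generated by an authenticator app. The sketch below uses the open-source pyotp library; treat it as a minimal illustration of the flow under those assumptions, not a complete authentication system.

import pyotp  # pip install pyotp

# Enrollment: generate a per-user secret and share it with the
# user's authenticator app (for example, via a QR code)
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

# Login: the user submits the 6-digit code currently shown on their
# device; totp.now() stands in for that user input in this sketch
submitted_code = totp.now()

if totp.verify(submitted_code):
    print("Second factor accepted")
else:
    print("Second factor rejected")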

IoT and smart medical devices 

The internet of things makes life a lot easier – and also more open to bad actors. Connected devices are an increasingly popular target for cyber threats. In 2019, cyber attacks on IoT devices increased by 300%, according to one report. This includes attacks on everything from laptops and webcams to smart homes (like Google Nest), smartwatches, routers, and other home appliances.

Our personal devices aren’t the only things that are vulnerable. The Software Engineering Institute of Carnegie Mellon University reported, “As more devices are connected to hospital and clinic networks, patient data and information will be increasingly vulnerable. Even more concerning is the risk of remote compromise of a device directly connected to a patient. An attacker could theoretically increase or decrease dosages, send electrical signals to a patient or disable vital sign monitoring.” Healthcare providers must also contend with protecting patient data. As many healthcare providers shift to remote work, they become an attractive target for hackers. Protected health information (PHI) must be kept safe during all cloud-based activities – yet many SaaS providers, including Slack, are not HIPAA-compliant right out of the box.

How to prevent IoT attacks: IoT attacks are sophisticated, and the best ways to protect your devices are to use strong passwords and keep your software up to date. Experts also suggest keeping your devices unlinked from social media.  Along with protecting your devices, look for a DLP partner who can protect your patient data while working on SaaS and IaaS platforms. Check out our coverage of instituting and maintaining HIPAA compliance on Slack and schedule a meeting below to learn more about how tools like Nightfall DLP play a role in keeping PHI safe.


This article was originally published at nightfall.ai

Featured Image Credits: Pixabay

One of the most interesting trends surfacing in the crypto industry today is the increasing likelihood of Bitcoin emerging as the next global reserve currency – something that Bitcoin fundamentalists have been preaching for the last decade. 

With the combination of transparency and decentralized trust brought on by the blockchain, individuals and companies across the world have had the opportunity to participate in a free financial system since the emergence of Bitcoin some twelve years ago. 

Since the dawn of blockchain, trust in this trustless system has been slowly rising, with a diverse range of individuals, institutional investors, and even world governments investing in the technology and the various tokens in circulation today. One result of this has been the free flow of liquidity across borders in a truly revolutionary way – satisfying the ever-growing need for a more efficient global financial system.

Mr. Yoon Kim is an accomplished and dynamic crypto analyst and strategist. He successfully built the TMT sector of Tremblant Capital and helped the company increase its AUM from $200 million to $5 billion in five years’ time. He then launched Vestry Capital, a global TMT equity fund, and as its head served as an advisor and consultant to various hedge funds and blockchain projects.

With his 20 years of experience in investing and in the blockchain industry, Mr. Kim acutely understands these shifts in the global financial system. 

For that reason, one of the key topics of conversation during The New Normal of Blockchain & Cryptocurrency panel which AIKON organized in late October was “where the future lies for the USD and its long-term position as the world’s reserve currency”. 

Mr. Kim described the USD losing some of its standing in the global financial system – and possibly its status as the reserve currency – as an inevitable product of blockchain’s accessibility and decentralization.

“The timing is very auspicious […] it becomes rational and logical for a lot of people to push Bitcoin as a reserve currency” – Yoon Kim


As Mr. Kim has pointed out, the current financial system has been in place since World War II – 75 years now! On average, global financial systems have typically lasted for ~70-80 years each. We are, then, coming to the end of an era and can stand with bated breath awaiting the next financial revolution. 

Moreover, history has shown that significant global events often precede the breakdown of institutionalized financial systems. For the Pax Britannica, it was World War I. For the global financial system we have today, it may very well be the impact of COVID-19 on the world economy.

Having been a staple of the global economy, and considering the turmoil the US has endured throughout 2020, the USD is in serious danger of being dislodged from the position of power it has enjoyed over the last three-quarters of a century.

“The prevailing global systems of finance, trade [and] economic activity [have been around for] 70 to 80 years” – Yoon Kim

Given the amount of influence that US politics now has on the rest of the world, the level of engagement that the USD (as a global reserve currency) will have with the rest of the world after the presidential election will probably never again reach the levels of 40 to 50 years ago, when it was at its peak. With the decrease in the level of US engagement with the world economy after the Soviet Union’s dissolution, what we see now are the effects of politics that took 20 years to materialize.

In that sense, Mr. Kim pointed out that it is very probable that USD is about to be dethroned as the most important currency in the world. 

And while there are those who would like to see the Chinese RMB take its place, Mr. Kim considers this very unlikely to happen. For one, dethroning USD from the position of the global reserve currency would put a significant amount of pressure and responsibility on the Chinese financial system, responsibilities the country seems to be shunning presently. For instance, China has been accused of intentionally increasing demand which then leads to an increase in the prices of international commodities. 

Therefore, the question is what will supplant USD as the global reserve currency or at least become an alternate reserve currency running in parallel with USD?

Mr. Kim stated that Bitcoin seems to fit perfectly, especially taking into account the timing of its rise, as well as its ability to cross borders with very little effort. 

As political and economic relations between the US and China continue to collapse, it is becoming increasingly unlikely that either the USD or RMB will be viewed as a viable global reserve currency going forward. 

Bitcoin may prove to be the thing that both nations, as well as the rest of the world, decide they can live with in the upcoming decades.

“BTC […] will become a reserve currency that stands aside and is not controlled by a single nation” – Yoon Kim

While the Chinese government is actively restricting crypto trades, there is massive support within the government for cryptocurrencies and blockchain. This implies that they have a long-term strategy in place, where Bitcoin would be used to dislodge the USD as the global reserve currency. 

Just as we are now seeing the effects of the global economic policies the US pursued over the past 20 years, there is a good chance that 20 years from now we will have Bitcoin as the reserve currency of the world, simply because it is not controlled by any one nation or its financial system.

Should Mr. Kim’s predictions come to be realized, individual and corporate players in this new market that is quickly gaining momentum should be preparing for the shift.


This article was originally published at Aikon.com


Featured Image Credits: Pixabay

Unstructured data is projected to account for approximately 80% of the data that enterprises will process on a daily basis by 2025. Data breaches and other security issues get a lot of attention in the media, but all businesses working with data, especially data in the cloud, are at risk of data loss. Preventing data loss can be difficult for a number of reasons.

IDC projects that by 2025, there will be 163 zettabytes of data in the world. To put that in context, one zettabyte is equal to a thousand exabytes, a billion terabytes, or a trillion gigabytes. The astronomical amount of data transmitting, living, and working in the cloud is just one of the complications that make securing data a tough task for businesses to manage. Of all the unstructured data in the world, most of it goes completely unused. According to industry analysts IDC, more than 90% of unstructured data is never examined. This means large portions of data float around unsecured and underutilized for many businesses.

That’s why it’s important to understand where unstructured data comes from, why it’s so hard to pin down, the risks of not securing unstructured data, and the rewards of bringing that data into a structured environment.

Hiding in plain sight

Unstructured data can come from almost any source. Nearly every asset or piece of content created or shared by a device in the cloud carries unstructured data. This can include:

  • Product demo videos on your website
  • QR codes for discounts and deals on an e-commerce app
  • Podcasts and other audio blogging files hosted on your website’s blog page
  • Social media messages on platforms like Facebook, Twitter, and LinkedIn

Internal communications and collaboration platforms are major sources of unstructured data. Think Slack, Confluence, and other SaaS applications where many people do their daily work and communicate with colleagues. Most cloud-based applications like these allow unstructured data to pass through massive networks to be shared, copied, accessed and stored unprotected.

IDG Communications published an article written by then-Pitney Bowes Software Vice President Andy Berry in 2018. Berry commented on how the modern workplace approaches data and why these norms contribute to the data loss problem, citing one study that found enterprises using almost 500 unique business applications. SaaS applications generate data that can quickly become obsolete, unusable, and eventually inaccessible.

Data powers everything we do in our professional and personal lives, but with little to no oversight on data hygiene, we often miss out on key opportunities to improve security blindspots and maximize data performance.

A complex problem

The various sources of unstructured data show how complex data loss can be. Many problems with DLP start with the three V’s of data — volume, velocity, and variety. It’s hard for humans and manual review to keep up with the staggering amount of data, speed of data proliferation, and the many different sources of data.

Adding to the problem is the fact that unstructured data is very difficult to organize. It’s impossible to dump every piece of unstructured information into a database or spreadsheet, because that data comes from myriad different sources and likely doesn’t follow similar formatting rules. On top of that, finding unstructured data through manual processes would take more time than there are hours in the day. It’s not a job for humans.

Other roadblocks to unstructured data collection include increasingly stringent privacy regimes, laws that protect intellectual property (IP) and other confidential or proprietary information like trade secrets, and businesses communicating across different security domains between the cloud and traditional hard-drive based storage systems. Information security is evolving at lightning speeds, but some schools of thought are still based on older priorities that focus on preventing outsider threats. It’s important to protect an organization from malicious actors, but what about good-natured, everyday workers who don’t know what they don’t know? That can still hurt an organization in tremendous ways.

Unstructured data isn’t all bad news. It can also be an opportunity for organizations that recognize two main ideas. First, this data must be gathered, protected, and understood. Second, there’s value in all the data that is currently going unused. Computer Weekly cited sources that estimate modern businesses are utilizing as little as 1% of their unstructured data.

Our world runs on data, and each person interacting with apps, platforms, and devices contributes to the growing data reserves. When organizations think about gathering data to help with marketing, business intelligence, and other key functions, they must also factor in the impact of unstructured data. Unstructured data presents equal risk and opportunity for business leaders. When that data lives in the darkness, its only impacts are negative. But when data is brought into the light, we can use that data to be smarter and better at work. 

Solving the unstructured data problem

Unstructured data is a major concern for organizations using cloud-based collaboration and communications platforms. Productivity relies on environments where co-workers can share ideas and messages quickly, without fear of exposing sensitive data. Nightfall, a data loss prevention (DLP) solution, provides much-needed security for today’s most used communications and collaboration platforms like Slack, Confluence, and many other popular SaaS & data infrastructure products.

Since these applications lack an internal DLP function, and each allows for the lightning-fast transmission of massive amounts of data, Nightfall’s machine learning-based platform is an essential partner for many organizations handling sensitive information like PII (personally identifiable information), PHI (protected health information), and other business-critical secrets. Nightfall’s three-step approach allows businesses to discover, classify, and protect unstructured data through artificial intelligence (AI) and machine learning (ML). Our solution makes sense of unstructured data, while traditional security solutions solely rely on users to help categorize data through methods like regular expressions (regex), which have limited accuracy in unstructured environments.
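
A small example shows why regex-only detection struggles in unstructured environments. The pattern below is a common way to look for US Social Security numbers, and the sample text is invented for illustration:

import re

# Naive SSN pattern: three digits, two digits, four digits
ssn_pattern = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

text = "Patient SSN: 078-05-1120. Conference dial-in PIN: 555-12-3456."
print(ssn_pattern.findall(text))
# ['078-05-1120', '555-12-3456'] – the dial-in PIN matches too, the kind
# of false positive a context-aware ML classifier is designed to avoid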

Each step of Nightfall’s ML solution is critical to the process of DLP. Discover means continuous monitoring of the sensitive data flowing into and out of all the services you use. Classify means ML classifies your sensitive data & PII automatically, so nothing gets missed. Protect means businesses can set up automated workflows for quarantines, deletions, alerts, and more. These three arms of DLP save you time and keep your business safe — all with minimal manual process or review oversight from you or your staff.

Helping businesses identify and access unstructured data

Data is a part of life, especially as remote work becomes an essential function for productivity and collaboration. Business leaders must understand the risk of ignoring unstructured data and the value of making that data work for the business. It’s a tall order to identify and bring in a mass of unknown data to the cloud, but the rewards come with a better understanding of your organization, your industry, and your customers. Good things can come from unstructured data — as long as you’re ready to approach the issue with a solid data strategy and a knowledgeable DLP partner like Nightfall.


About Nightfall

Nightfall is the industry’s first cloud-native DLP platform that discovers, classifies, and protects data via machine learning. Nightfall is designed to work with popular SaaS applications like Slack & GitHub as well as IaaS platforms like AWS. You can schedule a demo with us below to see the Nightfall platform in action.

This article was originally published at Nightfall.ai


Featured Image Credits: Pixabay

In the world of computer science, there are many programming languages, and no single language is superior to another. In other words, each language is best suited to solve certain problems, and in fact there is often no one best language to choose for a given programming project. For this reason, it is important for students who wish to develop software or to solve interesting problems through code to have strong computer science fundamentals that will apply across any programming language.

Programming languages tend to share certain characteristics in how they function, for example in the way they deal with memory usage or how heavily they use objects. Students will start seeing these patterns as they are exposed to more languages. This article will focus primarily on Python versus Java, two of the most widely used programming languages in the world. While it is hard to measure exactly the rate at which each programming language is growing, these are two of the most popular programming languages used in the industry today.

One major difference between these two programming languages is that Python is dynamically typed, while Java is statically typed. Loosely, this means that Java is much more strict about how variables are defined and used in code. As a result, Java tends to be more verbose in its syntax, which is one of the reasons we recommend learning Python before Java for beginners. For example, here is how you would create a variable named numbers that holds the numbers 0 through 9 in Python:

numbers = []

for i in range(10):
    numbers.append(i)

Here’s how you would do the same thing in Java:

ArrayList<Integer> numbers = new ArrayList<Integer>();

for (int i = 0; i < 10; i++) {
    numbers.add(i);
}

Another major difference is that Java generally runs programs more quickly than Python, as it is a compiled language. This means that before a program is actually run, the compiler translates the Java code into machine-level code. By contrast, Python is an interpreted language, meaning there is no compile step.

Usage and Practicality

Historically, Java has been the more popular language in part due to its long legacy. However, Python is rapidly gaining ground. According to GitHub’s State of the Octoverse report, Python has recently surpassed Java in popularity on the platform. As per the 2018 developer survey, Python is now the fastest-growing computer programming language.

Both Python and Java have large communities of developers to answer questions on websites like Stack Overflow. As you can see from Stack Overflow trends, Python surpassed Java in terms of the percentage of questions asked about it on Stack Overflow in 2017. At the time of writing, about 13% of the questions on Stack Overflow are tagged with Python, while about 8% are tagged with Java!

Web Development

Python and Java can both be used for backend web development. Typically developers will use the Django and Flask frameworks for Python and Spring for Java. Python is known for its code readability, meaning Python code is clean, readable, and concise. Python also has a large, comprehensive set of modules, packages, and libraries that exist beyond its standard library, developed by the community of Python enthusiasts. Java has a similar ecosystem, although perhaps to a lesser extent.

Mobile App Development

In terms of mobile app development, Java dominates the field, as it is the primary language used for building Android apps and games. Thanks to tailored libraries, developers have the option to write Android apps by leveraging robust frameworks and development tools built specifically for the operating system. Currently, Python is not commonly used for mobile development, although there are tools like Kivy and BeeWare that allow you to write code once and deploy apps across Windows, OS X, iOS, and Android.

Machine Learning and Big Data

Conversely, in the world of machine learning and data science, Python is the most popular language. Python is often used for big data, scientific computing, and artificial intelligence (A.I.) projects. The vast majority of data scientists and machine learning programmers opt for Python over Java while working on projects that involve sentiment analysis. At the same time, it is important to note that many machine learning programmers may choose to use Java while they work on projects related to network security, cyber attack prevention, and fraud detection.

Where to Start

When it comes to learning the foundations of programming, many studies have concluded that Python is easier to learn than Java, due to Python’s simple and intuitive syntax, as seen in the earlier example. Java programs often have more boilerplate code – sections of code that have to be included in many places with little or no alteration – than Python. That being said, there are some notable advantages to Java, in particular its speed as a compiled language. Learning both languages will give students exposure to two languages that lay their foundation on similar computer science concepts, yet differ in instructive ways.

Overall, it is clear that both Python and Java are powerful programming languages in practice, and it would be advisable for any aspiring software developer to learn both languages proficiently. Programmers should compare Python and Java based on the specific needs of each software development project, as opposed to simply learning the one language that they prefer. In short, neither language is superior to the other, and programmers should aim to have both in their coding experience.

                          Python    Java
Runtime Performance                 Winner
Ease of Learning          Winner
Practical Agility         Tie       Tie
Mobile App Development              Winner
Big Data                  Winner

This article by Andrea Domiter was originally published at junilearning.com

About the Author:

Andrea Domiter is pursuing a B.A. in Computer Science and Economics with a specialization in Data Science at the University of Chicago. She is currently an instructor at Juni Learning, teaching Python, Scratch, Java, and Pre-Algebra. Last summer, Andrea worked at RCP Advisors, a private equity firm based in Chicago, as a Research Analyst focusing on automating several processes. Andrea also loves to cook, hike, and read.


Featured Image Credits: Pixabay

In Texas, the energy industry plays an important role, particularly when it comes to green energy. Because of the prominent roles coal, oil, and renewable energy play in the Lone Star State, concerns over CO2 emission levels are equally important.

Burning fossil fuels and producing cement account for about two-thirds of all carbon dioxide (CO2) and industrial methane released into the atmosphere since 1854. Although the U.S. has cut more CO2 emissions than any other nation and is on pace to meet a 2009 pledge to reduce CO2 emissions by 17% (from 2005 levels) this year, global carbon dioxide emissions have still reached the highest point in human history.

The Trump administration dismantled Obama-era regulations that would have required power producers to slash CO2 emissions 32 percent below 2005 levels by 2030. China is the biggest contributor to greenhouse gases (by a large margin). The United States comes in second.

Image credits: Union of Concerned Scientists

Coronavirus pandemic affects CO2 emissions

The coronavirus pandemic’s impact on energy use and CO2 emissions has had major implications for global economies. In the first quarter of 2020, while many countries remained in full or partial lockdown, energy demand declined by 3.8 percent.

The hardest-hit industries include:

  • Coal. Global demand for coal fell by almost eight percent, compared to the same time in 2019. Low-priced gas and the continued growth in renewables globally, as well as mild weather across the U.S., capped coal use.
  • Oil. The demand for oil was down almost five percent in the first quarter of 2020, mainly due to shelter-in-place orders and reduced air travel during COVID-19. Since road transport and aviation together account for nearly 60 percent of global oil demand, the impact on the demand for oil was significant.
  • Gas. Although not impacted to the same degree as coal or oil, gas still saw a two percent reduction in demand in the first quarter of 2020.
  • Electricity. Experts estimate the demand for electricity since the COVID-19 lockdown has decreased by about 20 percent. Residential demand for electricity actually saw an increase, but this was far outweighed by the reduction in demand from commercial and industrial operations as businesses remained closed.
  • Renewables.  This is the only energy source that saw an uptick in demand.

Energy companies step up to address climate change

Every year the Center for Climate and Energy Solutions (C2ES) addresses how the industry impacts changing weather patterns and greenhouse gas emissions. An increase in droughts, wildfires, and hurricanes, climbing temperatures, and rising sea levels have energy companies scrambling to address the consequences of climate change on weather patterns and the environment.

However, in the past six months, climate change has taken a backseat to COVID-19-related conversation. Even so, according to the Oil and Gas Climate Initiative, nearly a dozen energy companies worldwide have agreed to cut the output of emissions by 36 million to 52 million tonnes (a metric unit of mass equal to 1,000 kilograms) per year by 2025.

Chart: Energy-related CO2 emissions from industry, 2019 (image credits: Center for Climate and Energy Solutions)

How are CO2 emissions produced?

Industries produce products and raw materials for use every day. The greenhouse gas emissions that industries emit are split into two categories, direct emissions and indirect emissions. The emissions come from the use of machines, computers, processing raw materials, heating and cooling buildings, use of petroleum in production, chemical reactions, and more.

  • Direct emissions are produced on-site at the facility
  • Indirect emissions are produced off-site and result from a facility using energy.

It’s difficult to weigh the cost to companies of reducing greenhouse gases over time. Obviously, the long-term gains to the environment will far outweigh short-term expenses. There is no economy-wide tax on carbon. Instead, greenhouse gas mitigation policies provide subsidies aimed at certain technologies, like solar and wind generation and biofuels.

The role of renewable energy

Although all sources of energy have an impact on the environment, renewable energy – solar, wind, hydroelectric, geothermal and biomass – has substantially less. However, that’s not to say that renewable energy has no environmental impact.

Wind.  Wind power produces no global warming emissions or toxic pollutants. However, wind power can impact wildlife, birds, and natural habitats.  Land use and copper consumption can also cause issues for the environment.

Solar. Solar power produces electricity from the sun, which is cost-effective and leaves little impact on the environment. However, the hazardous materials used during manufacture can still contribute to greenhouse emissions.

Geothermal. Geothermal plants use technology to convert resources from deep within the earth’s crust to electricity. Depending on the technology used, it can affect emission levels in the air.

Biomass. Biomass power plants, like fossil fuel power plants, generate electricity through the combustion of feedstock – in their case agricultural waste, forest products, and manure. How the biomass is generated and harvested can affect land use and add to global warming.

What can you do?

While you may not be able to influence large companies to change manufacturing processes, there are a few things you can do to stamp out even a small portion of greenhouse gases and CO2 emissions.

  1. Use your own reusable bottle or cup for water or coffee.
  2. Replace inefficient bulbs in your home with energy-efficient ones.
  3. Keep your thermostat a few degrees warmer or cooler.
  4. Recycle.
  5. Turn off the lights when you leave the room.
  6. Walk or bike to work.
  7. Don’t select one-day shipping unless necessary.
  8. Get outdoors, but pick up your litter.
  9. Use the SaveOnEnergy marketplace to find and compare renewable energy plans and rates available in your area.

This article by the Save On Energy Team was originally published at saveonenergy.com

Author Info:

Kathryn Pomroy is a freelance journalist from Minnesota who has written for dozens of major publications, magazines, and many well-known personal finance companies. She is also knowledgeable in energy-related topics like renewable energy, climate change and greenhouse emissions. Kathryn holds a BA in Journalism.


Featured Image Credits: Pixabay

Managing successful IT projects with Agile.

While the application of project management is not new, the advent of project management ‘approaches’ or philosophies has led to significant improvements in the quality of projects delivered and the overall project management process itself. Gone are the days when an Excel spreadsheet, work-in-progress (WIP) meetings and a reliance on Microsoft Office programs were sufficient IT project management tools. Now we are spoilt for choice with an increasing number of methodologies available to keep even the most complex of projects on track and within budget, while allowing for improvements to be made that may sit outside the original scope.

But when it comes to choosing the right project management methodology for your business there are many factors that come into play – such as the type of work you produce and the industry you (or your customer) are in. Today, we want to focus on one such methodology that has been critical in the successful management of technology projects: Agile.

Meet Agile.

An IT project is rarely touched by just one person. From the project manager overseeing the workflow through to the back-end developer, a collaborative approach is key – and with collaboration comes open communication and sharing of ideas. A fixed, traditional project management approach relies on delivering the initial scope, with the budget and timeline resting on it. But what if an unexpected idea pops up that could result in a more successful product – yet the timeline or budget doesn’t allow for it? Scope creep sets in or (potentially worse) the idea doesn’t eventuate at all.

In our view, the most successful projects are those that are responsive and allow for change along the way – and therein lies the key difference between Agile methodology and following more conventional methods of IT project management.

Agile Methodology

A fresh approach to project management.

To further explain Agile we need to compare it to a traditional project management methodology – let’s take Waterfall, as an example.

Waterfall is a linear project management model that relies on one phase being completed in sequence before another can begin. While timelines can be relied on and there should be (!) no scary surprises with cost, there is rigidity in this approach. Changes cannot be easily implemented and long-term projects may struggle to remain on track.

Agile on the other hand offers greater flexibility, making it well suited to ongoing projects of a complex nature. As explained by Agile Alliance, Agile is an overarching term for “a set of frameworks and practices based on the values and principles expressed in the Manifesto for Agile Software Development and the 12 Principles behind it”. It is in essence a more realistic approach to how projects with many components should be managed.

The four values that underpin Agile are: 

  1. Individuals and interactions over processes and tools
  2. Working product over comprehensive documentation
  3. Customer collaboration over contract negotiation
  4. Responding to change over following a plan

It’s important to note that the Agile methodology doesn’t suggest that all documentation, contracts, or plans should be disregarded or thrown out with the trash; rather that the focus of a project should be on collaboration, flexibility and responsiveness – and when these values are adhered to, businesses can expect boosted productivity and higher-quality output.

What’s in a framework?

The above values and their associated principles form the basis of the Agile methodology, but to put them into practice successfully, a framework or system needs to be adopted. The three most widely used options are:

  • Scrum (Used by Netflix, Adobe, Amazon and Apple)
    • Scrum works well for large scale, complex projects that rely on multiple tasks to be actioned as quickly and efficiently as possible. Projects are divided into manageable tasks, or ‘sprints’, which are monitored in daily Scrums (meetings). The regularity of the Scrums allows for continuous feedback and collaboration, and as each team member has clearly defined sprints Scrum promotes a culture of transparency. On the flipside, disadvantages to be aware of include an optimum team size to make it work (minimum three, maximum nine/ten) and the need for one or more team members to have the right experience to provide meaningful feedback.
  • Kanban (Used by Zara, Spotify, Pixar and Toyota)
    • Team members in a Kanban Agile environment are across a project’s workflow in real-time via a Kanban board. The board – which may be as literal as a whiteboard or a software program – forms the basis of the Kanban framework, with a project’s associated work items and their status able to be visualized at a glance. Considered an ideal framework for businesses new to Agile, it’s important to note that Kanban does run the risk of overcomplicating a project – so team members may still need a WIP to minimize confusion.
  • Extreme Programming (XP) (Used by IBM, Ford Motor)
    • Popular among software developers, XP enables teams to produce high-quality outputs while allowing for customer changes to be implemented throughout the process. This is enabled due to the level of testing undertaken, frequency of releases and an open channel of communication between customer and developer. While many programmers may find the idea of direct customer contact unnerving, the XP methodology is heavily reliant on mutual respect for it to work – and daily scheduled ‘stand up’ meetings should reduce the occurrence of ad hoc requests.

So, should I choose Agile?

No two businesses or cultures are the same – so knowing if Agile is right for your organization can only be determined by you and your stakeholders. But to guide you with your decision-making, several factors to be taken into consideration include:

  • What is the organizational structure like; does it allow individuals to work in smaller teams without reliance on those in leadership positions?
  • Do the types of IT projects you produce allow for collaboration both internally and externally?
  • Is the culture of your workplace or team flexible in nature and are individuals open to change?
  • Will Agile and your chosen framework allow people to deliver their best possible work?

Final thoughts

It’s important to have a healthy project pipeline – and we know that success is built on a collaborative approach. A team must be keen on its framework of choice, with every member clear about their outcomes, supported by daily stand-ups that keep the entire team on the same page and regular sprint demos that enable customers to provide instant feedback. The Agile methodology may not be right for everyone – so it’s important that businesses considering this approach undertake proper due diligence before taking the leap.


This article by Bilal was originally published at Makeen.io


Featured Image Credits: Pixabay

Digital transformation will create trillions of dollars of value. While estimates vary, the World Economic Forum in 2016 estimated an increase of $100 trillion in global business and social value by 2030. From AI alone, PwC has estimated an increase of $15.7 trillion and McKinsey an increase of $13 trillion in annual global GDP by 2030. We are currently in the middle of an AI renaissance, driven by big data and breakthroughs in machine learning and deep learning. These breakthroughs offer opportunities and challenges to companies depending on the speed at which they adapt to these changes.

Modern enterprises face 5 key challenges in today’s era of big data

  1. Handling a multiplicity of enterprise source systems

The average Fortune 500 enterprise has a few hundred enterprise IT systems, all with different data formats, mismatched references across data sources, and duplication.

  2. Incorporating and contextualizing high-frequency data

The challenge gets significantly harder with the increase in sensors and the resulting inflows of real-time data. For example, readings of the gas exhaust temperature for an offshore low-pressure compressor are of only limited value in and of themselves. But combined with ambient temperature, wind speed, compressor pump speed, the history of previous maintenance actions, and maintenance logs, this real-time data can create a valuable alarm system for offshore rig operators.

  3. Working with data lakes

Today, storing large amounts of disparate data by putting it all in one infrastructure location does not reduce data complexity any more than letting data sit in siloed enterprise systems.

  4. Ensuring data consistency, referential integrity, and continuous downstream use

A fourth large data challenge is representing all existing data as a unified image, keeping this image updated in real-time and updating all downstream analytics that use these data. Data arrival rates vary by system, data formats from source systems change, and data arrive out of order due to networking delays.

  5. Enabling new tools and skills for new needs

Enterprise IT and analytics teams need to provide tools that enable employees with different levels of data science proficiency to work with large data sets and perform predictive analytics using a unified data image.

Let’s look at what’s involved in developing and deploying AI applications at scale

Data assembly and preparation

The first step is to identify the required and relevant data sets and assemble them. There are often issues with data duplication, gaps in data, unavailable data and data out of sequence.

Feature engineering

This involves going through the data and crafting individual signals that the data scientists and domain experts think will be relevant to the problem being solved. In the case of AI-based predictive maintenance, signals could include the count of specific fault alarms over the trailing 7, 14 and 21 days, the sum of those alarms over the same trailing periods, and the maximum value of certain sensor signals over those periods.
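
As a sketch of what such trailing-window signals look like in practice, the snippet below computes them with pandas over an invented per-day alarm count. The column names and data are hypothetical, chosen only to mirror the example above:

import pandas as pd

# Hypothetical daily fault-alarm counts for one piece of equipment
df = pd.DataFrame(
    {"fault_alarms": [0, 2, 1, 0, 3, 5, 1, 0, 2, 4] * 3},
    index=pd.date_range("2020-01-01", periods=30, freq="D"),
)

# Trailing 7-, 14- and 21-day sums and maxima as candidate features
for days in (7, 14, 21):
    window = df["fault_alarms"].rolling(f"{days}D")
    df[f"alarms_sum_{days}d"] = window.sum()
    df[f"alarms_max_{days}d"] = window.max()

print(df.tail())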

Labelling the outcomes

This step involves labeling the outcomes the model tries to predict. For example, in AI-based predictive maintenance applications, source data sets rarely identify actual failure labels, and practitioners have to infer failure points based on a combination of factors such as fault codes and technician work orders.

Setting up the training data

For classification tasks, data scientists need to ensure that labels are appropriately balanced between positive and negative examples to give the classifier algorithm enough balanced data to learn from. Data scientists also need to ensure the classifier is not biased by artificial patterns in the data.
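
As a minimal sketch of one common balancing tactic – upsampling the rare class – here is scikit-learn’s resample utility applied to invented labels (95 healthy observations, 5 failures):

import pandas as pd
from sklearn.utils import resample

# Hypothetical labels: 95 healthy observations, 5 failures
df = pd.DataFrame({"label": [0] * 95 + [1] * 5})

majority = df[df["label"] == 0]
minority = df[df["label"] == 1]

# Upsample the rare failure class so both classes are equally represented
minority_upsampled = resample(
    minority, replace=True, n_samples=len(majority), random_state=42
)
balanced = pd.concat([majority, minority_upsampled])

print(balanced["label"].value_counts())  # 0: 95, 1: 95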

Choosing and training the algorithm

Numerous algorithm libraries are available to data scientists today, created by companies, universities, research organizations, government agencies and individual contributors.

Deploying the algorithm into production

Machine learning algorithms, once deployed, need to receive new data, generate outputs, and have some actions or decisions be made based on those outputs. This may mean embedding the algorithm within an enterprise application used by humans to make decisions – for example, a predictive maintenance application that identifies and prioritizes equipment requiring maintenance to provide guidance for maintenance crews. This is where the real value is created – by reducing equipment downtime and servicing costs through more accurate failure prediction that enables proactive maintenance before the equipment actually fails. In order for the machine learning algorithms to operate in production, the underlying compute infrastructure needs to be set up and managed.

Closed-loop continuous improvement

Algorithms typically require frequent retraining by data science teams as market conditions change, business objectives and processes evolve, and new data sources are identified. Organizations need to rapidly develop, retrain, and deploy new models as circumstances change.

The problems that have to be addressed to deliver AI applications at scale are therefore nontrivial. Massively parallel elastic computing and storage capacity are prerequisites. In addition to the cloud, there is a multiplicity of data services necessary to develop, provision, and operate applications of this nature. However, the price of missing a transformational strategic shift is steep. The corporate graveyard is littered with once-great companies that failed to change.


This article by Tamjid Aijazi originally appeared on Makeen Technologies.


Featured Image Credits: Pixabay

Energy is the lifeblood of all societies. But the production of energy from the burning of fossil fuels produces carbon emissions that are released into the atmosphere on a grand scale. The energy sector accounts for more than 70% of these emissions, which are driving climate change worldwide.

Reducing carbon emissions from the energy sector has a direct and positive impact on climate protection. So there needs to be a transition from the current energy system that relies heavily on fossil fuels to a system that uses renewable energy sources that do not emit carbon, such as wind and solar.

We also need to look at things like the electrification of transport and embrace a circular economy that seeks to reduce waste and the demand for energy. This process has already begun, but we need to speed it up – we’ve been dragging our heels for too long and now things are critical.

This will not happen by itself; it requires policy choices. These must be global, involving all states. It’s no good changing the energy sector of just one country. Energy has long been considered to fall within the domain of domestic policy. Yet international climate action is driving the transition to a low-carbon energy economy, on the basis of scientific evidence that highlights the importance of reducing energy consumption for the climate.

This must be done as quickly as possible. Some countries are more committed than others, but how much is actually being achieved (or not) must be monitored. This can only happen through the cooperation of all states under international law. Cooperative energy regulation demands innovative, flexible organisation and law-making at the international and regional levels.

Energy action = climate action

The United Nations (UN) is at the forefront of this international cooperation. In 2015, the General Assembly adopted Sustainable Development Goals (SDGs) that set out the progress the global community wants to make by 2030 on the most pressing challenges, from poverty reduction to climate change and energy transition.

SDG 7 relates to ensuring access to clean and affordable energy for all. It contains indicators of progress on renewables, access to electricity and energy efficiency. SDG 13 relates to urgent action to combat climate change and its impacts. These two goals work in tandem to encourage all states – developed and developing – to collaborate to make energy sustainable (meaning low-carbon), while ensuring access for all in every country by 2030.

That means international climate action equals energy action. The UN High-Level Political Forum is the place states get together to discuss progress on the SDGs, and where consensus is being (re)affirmed continuously.

SDGs 7 and 13 have been established and reinforced through the 2015 Paris Agreement on Climate Change. This is a binding treaty under international law adopted through the UN Framework Convention on Climate Change, by which the UN first addressed climate change in 1992. The agreement is the key international legal framework through which states aim to keep the increase in the temperature of the Earth’s atmosphere to well below 2℃, and ideally to limit it to 1.5℃ by the end of the century.

Signing up to cooperation

Almost all states have ratified the Paris Agreement and so must abide by it. If any state intends to withdraw from it, it must follow the agreement’s legal rules for doing so. So the US would only be able to withdraw – as Donald Trump insists it will – after the next presidential election. In the meantime, his administration continues to abide by the Paris Agreement rules and actually takes a very active role in the negotiations.

Domestic action is necessary to implement the promises of the Paris Agreement. Every state is obliged to submit “nationally determined contributions” that set the scene for the most ambitious climate protection plan at the national level.

These national plans on climate protection have a strong influence on energy regulation at the domestic level. The “Katowice package” (the Paris rule book), adopted in 2018, provides further guidance. For developed countries, the Paris Agreement stipulates that they adopt economy-wide greenhouse gas emission targets. These targets can only be achieved if the entire economy, including the energy sector, is “decarbonised”. That means that the use of fossil fuels has to end and be replaced by sustainable (renewable) energy.

Developing countries receive support under the Paris Agreement so that they too can move over time to economy-wide reduction targets. Only by acting together will the international community achieve the temperature goal of the Paris Agreement.

The 1994 Energy Charter Treaty, driven by the European Union and like-minded states, is emerging as the basis of transcontinental energy governance in Europe, Asia and Africa. This treaty covers energy investments, trade, freedom of energy transit, efficiency, and resolution of disputes. It is now being modernised to support the energy transition.

Cooperative energy regulation also occurs on a regional level, in Europe as well as Asia and Africa. The EU has adopted a frontrunner position with a strategy, based precisely on the Paris Agreement, running until 2080 and driving the transition of the continent’s energy system. Called the Clean Energy Package, it will create a transboundary, continent-wide energy system that better integrates renewables, improves efficiency and empowers consumer choice. Even after Brexit, the UK will likely remain connected to this market, as both the EU and the UK share the objective of achieving net zero carbon by 2050.

If humanity is to achieve its goal of fully and speedily transitioning to low-carbon energy while ensuring affordable access for all, then we must stay focused and committed and continue to cooperate internationally. The future of the generations that follow depends on it.


This article by Volker Roeben was originally published on TheConversation

About the Author:

Volker Roeben is Professor of Energy Law, International Law and Global Regulation at the University of Dundee, as well as a visiting Professor at the China University of Political Science and Law, Beijing, docent at the University of Turku and adjunct Professor at the University of Houston.

Prior to coming to Dundee, he was a Professor at Swansea University and a Senior Research Fellow at the Max Planck Institute for Comparative Public Law and International Law. He has held visiting professorships inter alia at the University of Chicago School of Law, has served as a clerk to Justice Di Fabio of the German Constitutional Court, and advised the Energy Charter, the European Parliament, international organisations and national parliaments.

Volker’s research combines energy law with public international law, European Union law and the theory of global law, with several books and numerous articles published and a research monograph on the EE Union in press with Cambridge University Press. He also serves on the board of the Max Planck Encyclopedia of Comparative Constitutional Law.


Featured Image Credits: Pixabay

 

