DisplayPort – DP

DisplayPort (DP) is an audio/video (A/V) display interface used to connect a video source to a display device. For example, you may connect a computer monitor to the PC using the DisplayPort. DisplayPort primarily replaces older interface technologies, including VGA and DVI.

The interface is typically found on tablets, notebooks, desktop computers, and monitors. The display interface is also included on some digital televisions but is more often associated with devices related to computing and digital consumer electronics (CE).

DisplayPort Specification

DisplayPort was developed by a consortium of PC and chip manufacturers and is standardized by the Video Electronics Standards Association (VESA). The DisplayPort specification calls for full A/V performance (up to 8K at 60 Hz), SuperSpeed USB (USB 3.1) data, up to 100 watts of power over a single cable, reversible plug orientation and cable direction, and adapters that support HDMI 2.0a and full 4K UHD resolution.
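To see why the specification targets such high link rates, it helps to estimate the raw pixel-data rate of an 8K stream. The following back-of-the-envelope sketch ignores blanking intervals and link-encoding overhead, which real links must also budget for:

```python
def uncompressed_bandwidth_gbps(width: int, height: int, refresh_hz: int,
                                bits_per_pixel: int = 24) -> float:
    """Raw pixel-data rate in Gbit/s, ignoring blanking and link overhead."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# 8K UHD (7680 x 4320) at 60 Hz with 24-bit color:
print(f"{uncompressed_bandwidth_gbps(7680, 4320, 60):.1f} Gbit/s")  # ~47.8 Gbit/s
```

Nearly 48 Gbit/s of pixel data alone makes it clear why high-refresh 8K video pushes the limits of any single-cable display interface.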

The current version is DisplayPort 1.4a, published in April 2018. This version defines new normative requirements and informative guidelines for component and system design.

DisplayPort Compatibility

DisplayPort primarily replaces older interface technologies. While newer GPUs and electronic computing devices are DisplayPort-ready, the interface is backward compatible with other interfaces. This allows PC owners to use a passive or active adapter (also called a plug adapter or adapter cable) to connect DisplayPort-enabled devices to older monitors or projectors that use DVI, HDMI, or VGA technologies.

DisplayPort Versus HDMI

While there is some overlapping competition between HDMI and DisplayPort, the two specifications are quite different, and each has a different product focus. Where HDMI is considered the de facto connection for home entertainment systems and is widely available on HDTVs as an A/V interface, DisplayPort was developed to support the higher performance requirements of personal computers and is based on updated signal and protocol technology.

Apple Mini-DisplayPort (mDP)

The Mini DisplayPort (mDP) is a smaller version of the DisplayPort connector developed and published by Apple. It is used to connect a Mac that has a Mini DisplayPort, Thunderbolt, or Thunderbolt 2 port to a display that uses a DVI or VGA cable.


BEC – business email compromise

Business email compromise (BEC) is a type of corporate financial scam that specifically targets organizations conducting business abroad. This scam relies upon the attacker’s ability to successfully impersonate communications from a company stakeholder that would be tasked with instructing other high-level employees in conducting business transactions and using wire transfers to pay manufacturers and suppliers. Spoofing or compromising these specific corporate employee email accounts can result in fraudulent transfers.

Often in BEC scenarios, the attacker will impersonate the high-level employee and instruct employees to share information or conduct transfers with a fictitious supplier. In other reported crimes, the attacker creates fake documents and invoices to impersonate the foreign manufacturer or supplier.

It has also been noted that attackers may initiate the BEC scam by targeting employees in HR to obtain personally identifiable information (PII) of stakeholders and other key employees to be used in future attacks.

Note: Business email compromise (BEC) is also called business email spoofing (BES).

The Five Common Types of BEC Attacks

According to security firm Trend Micro, there are five types of BEC attacks to be aware of:

  1. Bogus Invoice: Attackers pretend to be the suppliers requesting fund transfers for payments to an account owned by fraudsters.
  2. CEO Fraud: Attackers pose as the CEO or any executive and send an email to employees in finance, requesting them to transfer money to the account they control.
  3. Account Compromise: A high-level employee’s email account is hacked and used to request invoice payments to vendors listed in their email contacts. Payments are sent to fraudulent accounts.
  4. Attorney Impersonation: Attackers pretend to be a lawyer or from the law firm supposedly in charge of crucial and confidential matters.
  5. Data Theft: Employees in HR or bookkeeping are targeted to obtain personally identifiable information (PII) of employees and executives to be used in future attacks.

While business email compromise attacks use email and other forms of digital communication to succeed, the scam does not rely on technical security exploits, making it difficult for organizations to detect. Most security firms recommend employee education and additional security awareness training to identify and avoid BEC scams.
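Although BEC does not depend on exploits, some common impersonation patterns do leave traces in message headers. One widely used heuristic is comparing the From and Reply-To domains, since a mismatch often accompanies executive impersonation. Here is a minimal sketch using Python's standard email module; the function name and flag wording are illustrative, not taken from any specific security product:

```python
import email
from email.utils import parseaddr

def bec_warning_flags(raw_message: str) -> list[str]:
    """Flag header patterns commonly associated with BEC impersonation."""
    msg = email.message_from_string(raw_message)
    flags = []
    _, from_addr = parseaddr(msg.get("From", ""))
    _, reply_to = parseaddr(msg.get("Reply-To", ""))
    from_domain = from_addr.rsplit("@", 1)[-1].lower()
    reply_domain = reply_to.rsplit("@", 1)[-1].lower() if reply_to else ""
    if reply_domain and reply_domain != from_domain:
        flags.append(f"Reply-To domain {reply_domain!r} differs "
                     f"from sender domain {from_domain!r}")
    return flags
```

A heuristic like this only surfaces suspicious messages for human review; it is a complement to, not a substitute for, the employee training described above.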

Operation WireWire

In June 2018, an FBI-led operation brought down an international criminal organization whose main activity was business email compromise (BEC). The operation, known as WireWire, led to 74 arrests in seven countries and the recovery of 16.2 million dollars. (Source: Panda Security)


IT boot camp

Information Technology boot camps, or IT boot camps, have taken off in popularity in recent years as a way for employees to advance their careers and prospective employees to earn new jobs by gaining key programming skills in a short but often very intensive timeframe.

While IT boot camps didn’t exist prior to 2012, today there are over 100 IT coding boot camp companies and schools to choose from. A few of the best known and most reputable IT boot camps include App Academy, Hack Reactor, Hackbright Academy, General Assembly, Coding House, Dev Bootcamp, and Bloc.

These technology coding boot camps offer instruction on a wide array of computer programming languages, philosophies, and skill sets, including HTML, JavaScript, Python, PHP, Ruby on Rails, and more.

IT boot camps span an average of 12 weeks but can vary from 6 to 28 weeks or more in length, depending on the subject matter and the school or company providing the IT coding boot camp instruction. Most IT boot camps provide intensive, accelerated learning curriculums with hands-on projects where students develop their own software programs, web apps, and other digital tools.

IT Boot Camps Come in All Shapes and Sizes

When it comes to selecting an IT boot camp, options abound, with full-time, part-time, and online coding boot camps all available to select from, as well as locations across the country. Additionally, you’ll find IT boot camps run by independent organizations, by reputable colleges and universities, and also a hybrid of the two where an independent IT boot camp company partners with a reputable school for collaborating on boot camp offerings.

Full-time IT boot camps are often extremely intensive and require up to 80 hours per week of work over a two- to seven-month period of time. As a result of the time demand, this often means the student needs to take time off from their full-time position if they have one.

These full-time IT boot camps can also be quite expensive, although some companies will pay a portion or all of the expense of the coding boot camp for an employee if it’s an extremely valuable skill that the company lacks.

For those with less time or money to commit to a full-time boot camp, part-time IT boot camps or online bootcamps are often a more attractive option. Part-time IT boot camps typically require a commitment of 20 hours or so a week, which can be spent on location or online depending on the boot camp.

These part-time and online IT boot camps also tend to be more affordable than their full-time boot camp counterparts, making them an ideal option for those needing additional flexibility or a more economical boot camp option.

Are IT Boot Camps Accredited?

Most IT boot camps are not accredited in the same way colleges and universities are, and they are not able to grant degrees to students who successfully complete the boot camp. Many of the larger, more recognizable IT boot camps do, however, work with state agencies to become accredited as an IT boot camp.

And colleges and universities that offer their own IT boot camps or partner with IT boot camp companies are able to rely on their reputation and accreditation to provide boot camps that are frequently held in high regard by employers.

Some of the best-known schools providing reputable IT boot camps today include Northeastern University (Level), the Coding Bootcamp at UT Austin, Georgia Tech Coding Bootcamp, the Coding Bootcamp at UNC Chapel Hill, Northwestern Coding Bootcamp, University of Minnesota, Seattle University Coding Bootcamp, DeVry Bootcamp, California Coding Bootcamps (at UCLA, UC Irvine, UC Berkeley and UC San Diego), and Rutgers Coding Bootcamp.

Alternatives to / Preparation for IT Boot Camps

IT boot camps often require a serious commitment in terms of both time and money, and are not for everyone as a result. For those just getting started with coding, a free online course often makes more sense and can be an economical alternative to an IT boot camp or a great way to get hands-on knowledge of coding without a significant investment.

Free online coding courses from Codecademy, Code School, Coursera, and the online Computer Science and Programming courses through Harvard (edX) and MIT (OpenCourseWare) are great places to start if you’re just getting started with coding or are considering an IT boot camp in the future.


Scrum Master

Scrum Masters are responsible for leading and managing complex projects in Scrum, an agile methodology and framework. A Scrum Master’s primary responsibilities include ensuring the development team stays on track for development milestones while also reviewing the work in progress to ensure the Scrum group adheres to best practices.

A Scrum Master is also tasked with helping both those involved directly within a Scrum team as well as those impacted by the Scrum team’s work understand Scrum theory, practices, rules and values.

A Scrum Master is an advocate for Scrum theory and the work achieved by the Scrum Master’s team, and continually seeks to cause change in an organization that will increase the productivity, quality, and efficiency of the Scrum Team while also boosting the company’s return on investment (ROI) as a result of the Scrum Team’s work.

Sprints, Daily Stand-Ups, and the Scrum Master’s Role in Them

In Scrum, a team typically works in “sprints,” which are two-week development cycles that include short daily stand-up meetings led by the Scrum Master and lasting just 5 to 10 minutes. Each team member gets a chance to speak for roughly a minute in these stand-up meetings, succinctly covering:

1. What they did yesterday
2. What they will be doing today
3. What issues or blockers are preventing them from accomplishing their tasks
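The three stand-up questions map naturally onto a simple record per team member. As a minimal sketch (the class and function names here are illustrative, not part of any Scrum standard), a Scrum Master might collect the reported blockers for follow-up after the meeting:

```python
from dataclasses import dataclass, field

@dataclass
class StandupUpdate:
    member: str
    did_yesterday: str
    doing_today: str
    blockers: list[str] = field(default_factory=list)

def blockers_to_resolve(updates: list[StandupUpdate]) -> dict[str, list[str]]:
    """Collect only the updates that report blockers, keyed by team member."""
    return {u.member: u.blockers for u in updates if u.blockers}
```

For example, if one member reports “waiting on staging credentials” as a blocker, that member and blocker are what the Scrum Master works to resolve after the stand-up ends.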

The Scrum Master takes all of this information and works to help resolve issues while also encouraging and coaching the development team to be self-sufficient and self-organizing as much as possible when it comes to their tasks and challenges.

Scrum Masters vs. Project Managers

Scrum Masters are often compared to Project Managers, and while the roles are sometimes mistakenly considered to be similar, the two positions are actually quite disparate and require very different skill sets in terms of personality, experience and management style.

A Project Manager is closer to the role of a Product Owner on a Scrum team, serving as the leader and overall decision maker who is directly accountable to the company for managing the project and accomplishing its objectives.

The Scrum Master, on the other hand, serves in more of a coaching and facilitating role: supporting the Product Owner, coaching the team to be as efficient as possible, and resolving any roadblocks, personality issues, disagreements, or other impediments, while also ensuring the Scrum process is followed correctly and its value is maximized within the team and for the company overall.

How Much Do Scrum Masters Make on Average?

Because Scrum is the most widely used framework for agile development, Scrum-certified employees, and particularly Scrum Masters, are in high demand and typically command a high salary as a result.

The average annual salary for a Scrum Master in the United States is just under $88,000 as of 2018, with Scrum Master salaries typically ranging from a low of about $75,000 per year to just over $100,000 per year, depending largely on the size of the organization and industry.

While a number of companies provide Scrum training and certification programs, Scrum Alliance and Scrum.org are considered by many to be the most reputable Scrum certification organizations.


threat intelligence

Threat intelligence is the knowledge of the capabilities, resources, motives, and goals of potential security threats to an organization and the application of this knowledge in protecting against security breaches and data theft.

Threat intelligence is a continually evolving process that involves identifying potential security threat actors, understanding their motives and likely avenues for compromising security, and implementing policies and processes that prevent threat actors from compromising an organization as well as limiting the amount of damage they can cause if they are able to breach security.

The Continually Evolving Threat Intelligence Cycle Process

Because threat actors never stop developing and testing new techniques for their cyberattacks, threat intelligence is an ongoing, circular process or cycle rather than a linear, end-to-end process. As such, the threat intelligence cycle involves continual planning, implementation, analysis, optimization, and refinement of security data collection to better identify all of the following (and more):

[Image: The threat intelligence cycle. Source: Accenture.com]

Why Is Threat Intelligence Important?

The goal of the cyber threat intelligence process is to produce threat intelligence reports and insight that can be analyzed by corporate security or third-party security intelligence services to implement and/or improve automated security software as well as increase employee knowledge of potential security attacks on the company.

Overall, threat intelligence is designed to keep an organization, its security staff, and all of its employees informed of the security risks the company faces and how best to protect against these threats as well as new ones likely to emerge.


Apple WatchOS

Apple WatchOS is the company’s operating system developed to power Apple Watch smartwatch devices. WatchOS is based on Apple’s iOS mobile operating system that powers iPhone, iPad, and iPod Touch devices.

The WatchOS made its debut on April 24, 2015, when the Apple Watch was officially released to the public. An API called WatchKit was also released for developers to create WatchOS apps for the Apple Watch.

Apple Releases the WatchOS 2 Update

Apple introduced a major update to the WatchOS in September 2015 in the form of WatchOS 2. Apple WatchOS 2 delivers new features and functionality for Apple Watch devices, including more native apps with access to the smartwatch’s sensors and controls like the Taptic Engine, Digital Crown and more.

Additional new features include a “Time Travel” feature that displays your upcoming events up to 72 hours in the future (or prior for past events), tetherless Wi-Fi for connecting to Wi-Fi networks without needing to first connect to a paired iPhone, new watch faces, improved support for Siri and a nightstand mode.

WatchOS 3 Arrived in 2016

Apple announced the WatchOS 3 mobile operating system in the fall of 2016 in conjunction with the debut of the Apple Watch Series 2. The WatchOS 3 update offered improved performance over earlier releases, as well as a new Dock for easier navigation, the ability to launch favorite apps quickly, and new and improved fitness and health capabilities for the Apple Watch.

WatchOS 4

Apple unveiled the watchOS 4 mobile operating system in September 2017 along with the new Apple Watch Series 3. Apple watchOS 4 offered improved health and fitness features (including a Quickstart interface for faster starts to workouts and a High Intensity Interval Training workout option), plus personalized Activity coaching, enhanced measurements in the Heart Rate app, a new Siri watch face and improvements to the Music experience.

WatchOS 5 Debuts with Apple Watch 4 in 2018

Apple announced the watchOS 5 mobile operating system in the fall of 2018 in conjunction with the debut of the Apple Watch Series 4. With watchOS 5, Apple has added an array of new health features to go along with support for Walkie-Talkie mode, Notifications enhancements, Podcasts, an updated Siri watch face, and a redesigned Heart App.

Health and fitness improvements and new features in watchOS 5 include a new ECG app on the Apple Watch Series 4 that allows the wearer to take an electrocardiogram (ECG), Activity competitions, automatic workout detection capabilities, and Yoga and Hiking workout options. The 5.2 update for watchOS 5 expanded ECG support beyond the U.S. and also added support for the 2nd generation of AirPods.


ARM Servers

An ARM server, or Advanced RISC Machine server, is a computer server system built around a large array of ARM processors as opposed to the x86-class processors traditionally used in servers. ARM servers are touted for being able to provide similar or greater processing power than their x86 counterparts while consuming less energy and producing less heat.

As a result, ARM servers have become more and more popular in recent years and are now frequently deployed in enterprise data centers and cloud deployments. However, despite projections of massive growth for ARM servers, the road to mainstream adoption and gaining market share on Intel and its x86-based servers has not gone smoothly for ARM servers and ARM server processor manufacturers.

The Current State of the ARM Server Market

A number of ARM server chip manufacturers have attempted to push the ARM server industry forward, only to see their efforts come up short. Applied Micro Circuits Corporation (AppliedMicro), Broadcom (with its Vulcan ARM server chips), Qualcomm (with its Amberwing processor designs), Calxeda, Cavium, and others have put massive amounts of money into the ARM server market, only to later cease operations or find themselves acquired by other companies.

In their place, companies like HP with its Moonshot server systems, AWS with its Graviton ARM server chip, and Huawei Technologies with its HiSilicon subsidiary have been pushing the ARM server market forward.

Their efforts are helping ARM servers gain footholds in hyperscale cloud and enterprise data center computing environments, but Intel and its x86 servers still outpace ARM server deployments by a huge margin.


disruptive innovation (disruptive technology)

Disruptive innovation is a term used to describe a product or service that starts out small or simple and moves quickly through the lower tiers of a market segment, often displacing established businesses and technologies in lower margin sectors.

Generally, a disruptive innovation focuses on lower profit markets and customers, allowing a new startup or emerging business to be profitable by targeting customers who are underserved in the existing market. In time, the new product or service moves upwards in the market and can eliminate established companies who are slow to react or respond to the innovations.

Examples and Origins of the Phrase

Netflix is an example of a successful disruptive innovator. When Netflix started, it did not compete directly with the established market leader (i.e., Blockbuster), but its innovative on-demand streaming model ultimately disrupted Blockbuster by first focusing on a segment of underserved customers to gain industry presence.

Another classic example of disruptive innovation is the PC market. Personal computers were an innovation that improved over time and eliminated the mainframe industry. Blockchain is another example of a disruptive technology in financial markets.

The phrase disruptive innovation was coined by Harvard Business School professor Clayton M. Christensen in his research on the disk-drive industry and later popularized by his book The Innovator’s Dilemma, published in 1997.

Today, the phrase disruptive technology is a preferred synonym of disruptive innovation.


Multi-Factor Authentication (MFA)

Multi-Factor Authentication, or MFA, is a form of authenticating users that utilizes more than one method of identification when connecting to a secure site or service. This provides an additional layer of security over traditional single-factor authentication, which requires only one type of identification, such as a password.

When exactly two methods are used, MFA is also known as two-factor authentication, or 2FA. Multi-factor authentication can take several forms but most typically utilizes either a smartphone or a smart card in conjunction with a password or PIN. More advanced forms of MFA will sometimes rely on biometrics, such as fingerprint recognition or retina scanning, for the second form of authentication.

How Multi-Factor Authentication Normally Works

A user logging into a site secured with MFA will typically enter a password and then receive a code sent to the smartphone number (or email address) associated with the account. Only after the correct password and the authentication code have been successfully entered will the user be able to access the site or service.

While not completely foolproof, MFA does make it harder for hackers to log in to a user’s account, as they would need to have both the user’s password and access to their smartphone or similar device in order to correctly provide both forms of authentication.
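The one-time codes used in the second authentication step are often generated rather than stored: with the HOTP and TOTP algorithms (RFC 4226 and RFC 6238), the server and the user’s device derive the same short code from a shared secret, so intercepting one code is of little lasting use. A minimal HOTP sketch using only Python’s standard library:

```python
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HOTP: HMAC-SHA1 over the counter, dynamically truncated."""
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 4226 test vector: secret "12345678901234567890", counter 0 -> "755224"
print(hotp(b"12345678901234567890", 0))
```

TOTP, used by most authenticator apps, is the same construction with the counter replaced by the current Unix time divided into 30-second steps, which is why the displayed code changes every half minute.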

As a result, MFA is typically deployed for protecting access to more sensitive or mission-critical information within an enterprise. MFA is also increasingly becoming a requirement in government mandates for regulation and compliance.


full stack

Full stack in technology development refers to an entire computer system or application from front end (customer or user-facing) to the back end (the “behind-the-scenes” technology such as databases and internal architecture) and the software code that connects the two.

In computer software and application development, full stack engineers are those with the skills and expertise to manage employees and projects and develop the code that will create or optimize integration between front end and back end systems.

Front-end development involves the creation or optimization of the visible parts of websites or applications (the “client” end) that users view and interact with in their web browsers or on their mobile devices. These front-facing apps and websites are typically created using tools like HTML, CSS, and JavaScript.

Back-end development, on the other hand, involves creating and refining software code that integrates and communicates with an enterprise’s existing databases, software and other infrastructure (the “server” end) so that the front-end websites and/or apps can deliver the information visitors or customer need.
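As a concrete sketch of the back-end half of this picture, here is a minimal JSON API endpoint built with only Python’s standard library; the route and the product data are purely illustrative. A front-end page would fetch this URL with JavaScript and render the result in the browser:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class ApiHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/api/products":
            # The "server" end: query data (here hard-coded) and return JSON.
            body = json.dumps([{"id": 1, "name": "Widget", "price": 9.99}]).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):
        pass  # keep the console quiet in this sketch

def run(port: int = 8000) -> None:
    """Serve the API until interrupted."""
    HTTPServer(("localhost", port), ApiHandler).serve_forever()
```

On the client end, a call like `fetch('/api/products')` in the browser would receive this JSON, illustrating the front-end/back-end integration that full-stack developers build and maintain.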

What Skills and Programming Languages Do You Need to Be a Full Stack Developer?

Those interested in pursuing a career as a full-stack developer or full-stack engineer will need to be well-versed in a variety of computer programming languages. Full-stack developers need to be proficient in languages used for front-end development like HTML, CSS, and JavaScript, as well as third-party libraries and extensions for Web development such as jQuery, LESS, Sass, and React.

For full stack developers, mastery of these front-end programming languages will need to be combined with knowledge of UI design as well as customer experience design for creating optimal front-facing websites and applications.

Full-stack developers and engineers also need to know how to code in back-end languages like Java, PHP, Ruby, Perl, Python, and C in order to ensure their front-end websites and applications are able to integrate with back-end business applications and infrastructure. Knowledge of databases, and of tools for interacting with them like MySQL, Oracle, and Microsoft SQL Server, is essential for full-stack developers as well.

Full Stack Developer: Jack of All Trades / Master of None?

It’s important to note that full-stack developers and engineers are not expected to be masters in all of these programming languages; rather, those in a full-stack development role should have a solid proficiency in one or more of these languages while also understanding how front-end and back-end technology need to integrate and work together efficiently.

Hands-on experience is the best way to become a competent full-stack developer, but for those new to full-stack development or programming and technology in general, coding bootcamps and online programming courses from sites like General Assembly, Coursera, Thinkful, and Lynda can be helpful for getting started and learning some of the essential skills needed to advance a career in full-stack development.

Full Stack vs. Technology Stacks

The term full stack is sometimes confused with a closely related term, technology stack. Technology stacks refer to a collection of specific software tools and technologies that can be used in conjunction to develop websites, apps, and other software programs.

Web tech stacks such as the common LAMP (Linux, Apache, MySQL and PHP) are frequently used in full stack development, but the terms full stack and tech stack are not completely interchangeable.

Technology stacks are combinations of specific software tools and technologies that help with software development, whereas full stack is a term that applies to the overall development of projects combining front-end and back-end systems. It entails the entire development process, from conception and planning all the way through coding, testing, sending the system into production, and optimizing existing systems.