Wednesday, July 17, 2019

A Look at 8K TVs and Monitors

   High-resolution TVs and monitors have become increasingly popular as more and more people use them at work and at home. Most TVs have a resolution of 3840 by 2160 pixels; these are known as 4K or UHD (ultra-high-definition). Some computer monitors are 4K, but many are still 1080p or Full HD, which is 1920 by 1080 pixels. Now, some companies have begun selling 8K TVs and monitors with a resolution of 7680 by 4320 pixels - four times the pixel count of UHD. Here are some 8K products available:
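The "four times" figure falls straight out of the pixel counts; a quick check:

```python
# Compare total pixel counts of 4K (UHD) and 8K panels.
uhd = 3840 * 2160      # 4K / UHD pixel count
eightk = 7680 * 4320   # 8K pixel count
print(eightk, eightk / uhd)  # 33,177,600 pixels, exactly 4.0x UHD
```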


   Samsung was one of the first companies to release an 8K TV and has made many variants since. Its largest model measures 98” and packs more than 33 million pixels. It also utilizes Quantum HDR, which makes colours feel more vibrant by increasing depth for clarity and sharpening details. 8K Direct Full Array technology assists here by improving contrast and making colours more precise. Samsung’s website sells 6 variants of the Class Q900 QLED Smart TV ranging from 55” to 98”. The 98” variant is sold at $99,999.99, but at the time of writing it was on sale for $69,999.99, a 30% discount. The 55” variant has a much more reasonable price at $3,499.99. Additional soundbars with Dolby Atmos, subwoofers, rear speakers, and warranties are also available for purchase.

Check out the Class Q900 here


   Dell has been known for its monitors, and it now presents the world’s first 31.5” 8K monitor, the Ultrasharp 32. The Ultrasharp 32 has won several awards, including “Best Monitor” and “Best Pixels” at CES 2017. It utilizes Dell PremierColor, which supports all major industry colour standards such as sRGB, AdobeRGB and Rec. 709, covering a total of 1.07 billion colours. The monitor uses two DisplayPort inputs and three USB ports and has a refresh rate of 60Hz.

Check out the Ultrasharp 32 here


   Sony has showcased its 8K TV but hasn’t released it yet. First shown at CES 2019, it boasted technology that could upscale lower-resolution images to 8K. It also pairs Sony’s Smart TV platform with Android TV to offer features such as Google Assistant, Google Play, and Netflix built in. At the time of writing, Sony’s Z9G, part of the Master Series, was not yet priced, but 85” and 98” variants were showcased.

Check out their 8K TV here

Tuesday, July 16, 2019

Starlink: Science Fiction or Something from the Near Future?

   Everyone knows that Elon Musk’s projects revolutionize their industries and catalyze humanity’s progress - and his new proposal is no exception. Starlink, a satellite constellation, would be a giant collective of 12,000 satellites orbiting the Earth, providing high-speed internet to even the most remote locations on the planet. As of May 23, 2019, SpaceX has launched 60 satellites for testing purposes on its Falcon 9 rocket. Currently, Starlink is projected to begin official operation in 2021. However, the question remains: how can such a high volume of devices be sent into space, and is the task even feasible?

   First, let’s take a closer look at the financial side of this project. Each satellite weighs around 250 kilograms. At the Atlas V rocket’s approximate price of $20,000 per kilogram to orbit, it would cost 60 BILLION dollars in transportation alone to implement Starlink. Nevertheless, Elon Musk has an answer to this problem. With SpaceX’s Falcon Heavy rocket, which uses a reusable booster system, the price of launching 1 kg to space drops to $1,700. This drives the cost of launching all 12,000 Starlink satellites down to 5.1 billion dollars.
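Both launch-cost figures follow directly from the per-kilogram prices quoted above; a quick sanity check:

```python
# Sanity-check the transport-cost figures for the full
# 12,000-satellite constellation at ~250 kg per satellite.
SATELLITES = 12_000
MASS_KG = 250                      # approximate mass per satellite

def constellation_cost(price_per_kg):
    """Total transport cost for the whole constellation, in dollars."""
    return SATELLITES * MASS_KG * price_per_kg

atlas_v = constellation_cost(20_000)   # ~$20,000/kg on Atlas V
falcon = constellation_cost(1_700)     # ~$1,700/kg with reusable boosters

print(f"Atlas V:      ${atlas_v / 1e9:.1f}B")   # $60.0B
print(f"Falcon Heavy: ${falcon / 1e9:.1f}B")    # $5.1B
```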

   At this point, many of you might ask why it’s necessary to have this myriad of satellites hovering over the Earth. The answer becomes clear when taking developing countries into account. According to Internet World Stats, only 38% of the population in Africa has access to the internet, compared to 90% in North America. Introducing the internet to these individuals would let them tap into a nearly bottomless pool of educational and academic resources. Poor internet access drags down the Human Development Index and creates a loss in human capital. Currently, the price of internet access in Africa is roughly $35 per gigabyte, which makes it almost impossible for the general population to afford, as the average daily income is only $6. The Starlink project would drive down this price: a village could purchase a $200 receiver, and individuals would then pay a severely reduced fee to access the internet.

   Now one last question remains: how would Elon Musk benefit from the project? The answer is quite clear; it’s the revenue generated by Starlink. Musk predicts the project has the potential to bring in $30 billion by 2025, which would provide additional funding for other SpaceX projects, including the human mission to Mars.

   So is the Starlink project something out of a sci-fi book? The short answer is no. We currently possess the technology to implement a project of this scale for a relatively low price. However, the fact that the project works on paper doesn’t mean it will translate smoothly into the real world. Many unknown factors will only come to light once such a grandiose plan is under way.


Sunday, July 14, 2019

Forest Restoration: A Solution to Reducing Global Warming

A recent study conducted at the Swiss Federal Institute of Technology in Zurich suggests that forest restoration may be one of the best available solutions for slowing down global warming.

Industrial activities and automotive emissions contribute about 24 billion tonnes of carbon dioxide per year around the globe, according to the United States Geological Survey (USGS). These emissions can kill off species, reduce biodiversity, and significantly pollute our earth.

Tom Crowther of the Swiss Federal Institute of Technology in Zurich and his team analyzed about 80,000 satellite photo measurements of trees worldwide. They then combined this with data from geological databases on climate conditions to determine which areas of the earth could support more trees, and from that deduced an approximate figure for how many trees the earth could hold.

The results showed that about 0.9 billion hectares of land around the world could be used for new forest. Trees grown in these areas could store about 205 gigatonnes of carbon and consequently slow down global warming significantly.

This would have a positive impact on our future climate and reduce the chances of climate-related disasters. If we as citizens don’t protect our earth now by reducing our carbon emissions and planting more trees, even developed countries may experience serious problems such as water shortage and drought.

As forest restoration grows around the world, it reduces the need for more expensive carbon sequestration procedures.

Alternatives such as ocean fertilization and special machines that filter carbon dioxide out of the atmosphere are widely discussed today. If more people support forest restoration instead, it could not only save governments a great deal of money each year but also increase biodiversity.

By increasing the number of trees we plant, we can bring more wildlife to our earth and even gain valuable resources such as medicine and food.

In contrast, Robin Chazdon notes that the rapid pace of climate change could undermine the benefits of forest restoration. Climate change can prevent many trees from growing normally and can cause an uneven distribution of trees in tropical areas.

In addition, some worry that forest restoration could reduce the agricultural space available for farming and other human activities. This could have a negative impact on industrial areas and limit the income cities can generate.

Finally, it is recommended that politicians and scientists work more closely together to create public initiatives that promote solutions for climate change and raise awareness of its negative effects on our planet.


The Future of Fuel Cells

Key barriers preventing widespread adoption of the virtually emission-free fuel cells include high upfront costs, storage as well as distribution of hydrogen. Government-mandated programs and hybrid technologies could just be the catalysts needed to overcome these hurdles. 

Much like batteries, fuel cells convert chemical energy into electrical energy. The difference is that a fuel cell can produce energy for as long as fuel and oxidant are supplied, whereas a typical battery eventually goes dead.

It is composed of a positively charged plate called the cathode and a negatively charged plate called the anode, separated by an electrolyte membrane. Its operation is quite simple:
  1. Hydrogen is fed to the anode, where it splits into electrons and protons. Simultaneously, oxygen from the air is fed to the cathode.
  2. The protons pass through the membrane, whereas the electrons cannot.
  3. The blocked electrons are forced to take an external path to the cathode, and this flow of electrons generates an electrical current (and excess heat).
  4. At the cathode, the protons, electrons, and oxygen combine to produce water molecules as a by-product.
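None of these numbers appear in the post, but the chemistry in the steps above fixes the cell’s theoretical voltage. A minimal sketch using textbook constants (the Gibbs free energy released forming liquid water, the Faraday constant, and the two electrons each hydrogen molecule supplies):

```python
# Theoretical open-circuit voltage of a hydrogen fuel cell.
# Each H2 molecule releases 2 electrons (step 1 above); the usable
# energy per mole of H2 is the Gibbs free energy of water formation.
DELTA_G = 237_130      # J/mol, Gibbs free energy of forming liquid water
FARADAY = 96_485       # C per mole of electrons
N_ELECTRONS = 2        # electrons transferred per H2 molecule

voltage = DELTA_G / (N_ELECTRONS * FARADAY)
print(f"Ideal cell voltage: {voltage:.2f} V")  # ~1.23 V
```

Real cells run well below this ideal figure under load, which is why they are combined into stacks.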

Fuel cells are a clean method of producing energy, their only outputs being electricity, excess heat, and water. Moreover, they contain no moving parts, so they operate near-silently and are far more energy efficient than traditional combustion technologies.

One of the largest advantages of this energy source is its scalability: individual fuel cells can be combined into stacks, which in turn can form larger (usually stationary) systems used to power electric vehicles, hospitals, schools, and even multi-megawatt installations connected directly to utility grids. Their equally useful portable applications shouldn’t be forgotten either; they can power anything from phone and laptop batteries to hearing aids.

So that brings us to the question: if they’re so great, why aren’t they being used everywhere?

One key barrier to widespread adoption is the high upfront cost, which cannot yet compete economically with traditional energy sources.

Storage and distribution of hydrogen is another large hurdle we have yet to cross, reflected in the lack of refuelling stations. One possible solution is to use methanol (a liquid fuel) in automobile fuel cells, as it is easily transportable; the drawback is that it also produces polluting carbon dioxide. For these reasons fuel cells have faced backlash from politicians (e.g. from the U.S. Secretary of Energy in 2009), which in turn contributed to cuts in university research funding.

Provinces like British Columbia are mandating a new zero-emission vehicle (ZEV) initiative, to be introduced later this year, which would push automakers to allocate more fuel cell and battery electric vehicles to the province. To further incentivize consumers, rebates of up to $6,000 are being offered on fuel cell cars. If enough measures are taken to increase demand, cost reductions can follow through volume manufacturing.

Moreover, large fuel cell power companies like FuelCell Energy are entering into long-term power purchase agreements that settle all financial technicalities with local distributors and may thus eliminate the upfront investment hurdle. Hybrid technologies that combine fuel cells with gas turbines and engines to capture unreacted fuel are another emerging solution that improves efficiency and affordability.

According to a recent report by KPMG, hydrogen fuel cell electric vehicles have replaced battery electric vehicles as the no. 1 key trend in the automotive industry through 2025. Fleet operators like FedEx are exhibiting this by investing in hybrid hydrogen-and-battery electric vehicles, whose range is 166% farther than with batteries alone - increasing vehicle uptime while decreasing fuel and maintenance costs.

Fuel cells are not currently at the forefront of usage nor are they the newest developments in the energy markets. But they could be.

Wednesday, July 3, 2019

Two Earth-Like Exoplanets Detected Orbiting Nearby Star

Image Credit: Planetary Habitability Laboratory

The search for life elsewhere in the universe has just received another major boost. An international team led by the University of Göttingen has detected two planets orbiting the 24th-nearest star to the Sun. Teegarden’s star is a red dwarf situated around 12.5 light years away from our solar system, and is approximately eight billion years old. More importantly, it is home to two Earth-like planets, Teegarden b and Teegarden c.

Both planets are believed to be terrestrial (rocky) worlds. Teegarden b has a mass of 1.05 Earth masses, orbits 0.0252 AU from its star, and takes a mere 4.91 days to complete a single orbit. Similarly, Teegarden c has a mass of 1.12 Earth masses, orbits 0.0443 AU from its star, and completes one orbit in 11.409 days. Both planets rank among the 19 most habitable of the roughly 4,000 planets known to science. In fact, Teegarden b has the highest ESI (Earth Similarity Index) discovered so far.
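The quoted distances and periods can be cross-checked with Kepler’s third law. A quick sketch in Python; the stellar mass used (~0.089 solar masses) is an assumption taken from the published literature on Teegarden’s star, not from this post:

```python
import math

# Kepler's third law sanity check: T^2 = a^3 / M,
# with T in years, a in AU, and M in solar masses.
STAR_MASS = 0.089  # solar masses (assumed, not from the post)

def period_days(a_au, m_star=STAR_MASS):
    """Orbital period in days for a circular orbit of semi-major axis a_au."""
    return math.sqrt(a_au ** 3 / m_star) * 365.25

print(f"Teegarden b: {period_days(0.0252):.2f} d")  # ~4.90 (post: 4.91)
print(f"Teegarden c: {period_days(0.0443):.2f} d")  # ~11.42 (post: 11.409)
```

Both results land within about 0.1% of the published periods, as expected.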

Although it is possible that both planets could host liquid water on their surfaces, Teegarden b is the favoured candidate for habitability. There is a 60% chance that it has a temperate surface environment, meaning temperatures between 0 and 50°C. The actual temperature would vary with atmospheric composition, with 28°C being the likely surface temperature if the planet has an Earth-like atmosphere. By contrast, there is only a 3% chance that Teegarden c has a temperate surface environment, with a likely surface temperature of around -47°C under an Earth-like atmosphere.

Although these initial findings seem promising, especially for Teegarden b, further study is required to determine how habitable these planets really are. They were discovered using the radial velocity method and are unfortunately non-transiting, which means that determining other key characteristics, such as radius, may require direct observation with a future telescope such as the James Webb Space Telescope. As well, red dwarfs are known to emit violent flares, which could destroy the planets’ atmospheres and sterilize their surfaces. Because the planets orbit so close to their star, they may also be tidally locked, meaning one side would face the star at all times. This could create two extreme hemispheres rather than an overall temperate climate, rendering the planets uninhabitable. Follow-up studies will be needed to further assess the habitability of these two worlds.

Read the original research paper here:

Saturday, June 29, 2019

Study Finds Gut Bacteria Linked To Chronic Pain

Dr. Amir Minerbi and researchers from McGill University, Université de Montréal, and the McGill University Health Center’s Research Institute have introduced a new way to look at the effects of gut bacteria on chronic pain.

Chronic pain is a characteristic symptom of a condition known as fibromyalgia. The research found that, within the microbiome of affected individuals, 20 separate bacterial populations deviated in size from those of healthy, non-affected individuals. This was observed by obtaining stool, blood, urine, and saliva samples from 156 individuals residing in Montreal, 77 of whom were diagnosed with fibromyalgia.

The analysis found that fibromyalgia and its associated symptoms - including altered sleep, cognitive issues, and fatigue - were the strongest predictors of the bacterial imbalance, and that the more heavily imbalanced the bacterial populations were, the more intense the symptoms.

To rule out other possible factors that may alter the microbiome of individuals with fibromyalgia (such as diet, age, and exercise), a varied collection of techniques was used, including artificial intelligence.

Hopes for the future of this finding include sampling a cohort with a more diverse geographical range and extending the research to animal studies, to test the relationship between bacterial changes and the progression of fibromyalgia.

Read the full story at: 

Image Credit: Dr. Amir Minerbi

Extra Resources:

Sunday, June 23, 2019

Artificial Intelligence and Deepfake Videos: A Challenge Society Faces Today

The rise of artificial intelligence in the 21st century has its peaks and pits, but what keeps our society attracted to technology is its power to bring efficiency: fuel-efficient cars, new kinds of television screens, real-time language translation and much more.

However, with the rise of advanced technology, some forget its limits and employ social media to mislead people. Multiple recent cases involve the harmful use of “deepfake” technology, which applies artificial intelligence to fabricate the images and voices of celebrities and well-known officials. Deepfakes can disseminate misleading information and are a particular problem for politicians, especially those with a large social media presence.

Notably, according to a report by Emily Tillett of CBS News, a falsified video of United States House Speaker and Democrat Nancy Pelosi appearing impaired caused a massive social media outbreak. The video was popular on Facebook, with over 2.5 million views. Many United States House Representatives and government officials considered the scam absolutely unacceptable, as it falsely portrayed an important political figure. Some Democratic officials and members of the public began questioning the credibility of certain figures and whether what the video showed was true; her position as House Speaker was jeopardized, and its potential impact on national security is being considered. Now, who was to blame for all of this?

According to CBS News, tech mogul Mark Zuckerberg offered his apologies for keeping the video on Facebook for so long. Although it wasn’t removed, Zuckerberg ensured it was made less visible in people’s feeds. Being a target himself, he understands the effects of deepfake videos: according to an article by BBC News, a deepfake video earlier this year featuring Mark Zuckerberg with a doctored voice and falsified words was created by two comedic artists, Daniel Howe and Bill Posters. They wanted to promote a new form of advertising that persuaded people to share rather secretive information about their personal lives. The videos were created primarily for comedy, not with the intention of hurting anyone.

However, a key public concern is the negative effect such videos could have on the upcoming 2020 presidential election. If deepfake videos continue, many important political candidates could be humiliated, and this could affect their vote counts.

Therefore, to prevent such events from happening, United States House Intelligence Committee Chair Adam Schiff and other lawmakers suggest that social platforms impose strict restrictions and adopt better policies regarding the type of content people can create and post. They also ask that the government enact stricter laws preventing harmful deepfakes from being shared across the internet. This would make more people aware of deepfakes and the consequences they can have on thousands of people, and the internet could then become a safer and more respectable place to find information.