To Treat Toxic Wastewater Without Chemicals, Scientists Develop Nano-Material From Seaweed

Treatment of wastewater containing industrial dyes and toxic heavy metals is a major environmental problem, as available treatment techniques are neither very efficient nor environmentally friendly. Now, a team of Indian scientists has developed a nanomaterial derived from seaweed for effective treatment of toxic wastewater without using any chemicals.

Membrane-based filtration processes are generally used to treat industrial wastewater, but they can’t fully filter out heavy metal contaminants. To address this problem, processes that use activated carbon, graphene or carbon nanotubes are being developed, as carbon-based processes can help remove dyes and heavy metals through adsorption.

Researchers at the Central Salt and Marine Chemicals Research Institute, Bhavnagar, have gone a step further to make the carbon-based cleaning process fully green by using seaweed as the starting material. They have synthesised a graphene-iron sulfide nanocomposite from an abundantly available seaweed, Ulva fasciata, through a direct pyrolysis technique.

Seaweeds are known as carbon sinks. In some earlier studies, biomass of Ulva fasciata was directly employed to adsorb copper and zinc ions from water, but the uptake capacities were relatively low. This problem has been overcome by deriving thin carbon sheets from the seaweed at very high temperatures. These graphene sheets are doped with iron. The nanocomposite obtained from seaweed has shown very high adsorption capacity for various cationic and anionic dyes as well as for lead and chromium.

The nanocomposite can be used for up to eight cleaning cycles with only nominal loss of its adsorption capacity, and even mixed dyes could be adsorbed. A maximum adsorption capacity of 645 mg per gram for lead was achieved at neutral pH. This is the highest ever reported for any biomass-derived carbon material, the scientists claim in their study, published in the Journal of Hazardous Materials. The material could also remove highly toxic hexavalent chromium from wastewater.

“Presence of high concentration of salts had negligible effect on the adsorption properties of the nanocomposite, making it a suitable candidate for the pre-treatment of highly contaminated wastewaters,” explained Dr. Ramavatar Meena, who led the team, while speaking to India Science Wire.

Cloud migrations: Don’t settle for just some operational savings

I’ve stopped thinking of simple migration to the public clouds as “success.”

Yes, businesses do benefit. You decrease operational costs by a certain amount, if you plan correctly, and you certainly increase the convenience of not having to deal with hardware and software. But all that gets you to just 10 to 20 percent in savings.

And those savings come with a big price tag: Migration projects are very labor-intensive, and they often run into internal politics, cost overruns, and compliance problems as you drive platform changes.

Moreover, you have to consider the cost of risk. If you bother to calculate it, the risk is high considering the issues I just mentioned, and that could remove any benefit gained for at least a few years.
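To make that concrete, here is a rough back-of-the-envelope sketch in Python. Every figure below is hypothetical and only illustrates how quickly migration costs and risk can eat into a 10 to 20 percent operational savings:

```python
# Hypothetical figures, purely for illustration.
annual_it_spend = 10_000_000   # current annual operations spend
savings_rate = 0.15            # the 10 to 20 percent operational savings
migration_cost = 2_500_000     # labor-intensive migration project
risk_buffer = 1_000_000        # politics, overruns, compliance issues

annual_savings = annual_it_spend * savings_rate
payback_years = (migration_cost + risk_buffer) / annual_savings
print(f"Annual savings: ${annual_savings:,.0f}")
print(f"Years until the migration pays for itself: {payback_years:.1f}")
```

With these invented numbers, the migration takes more than two years just to break even, which is exactly the few-years erosion of benefit described above.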

Of course, I am not arguing against migration to the cloud. But most enterprises need to think more deeply about why they are migrating to the cloud, and then how.

Unfortunately, most enterprises consider the cloud to be a tactical technology, and the CFOs and CEOs are glad to see the cost reductions. But if the use of cloud computing is not transformative to the core business, it’s really not providing the ROI you seek.

“Transformative” means that you leverage the innovation and disruption that cloud computing provides: for example, a car company that removes all friction from its supply chain by using cloud-based technologies, or a bank that finally uses its systems to gain access to key customer data, letting it provide better products and increase market share.

These are tricks we’ve done with technology for years, but the cloud removes much of the complexity and cost of onboarding these technologies through traditional mechanisms. For example, in the cloud, you can access, within a few hours or even a few minutes, machine learning technology and advanced analytics, as well as databases that can store many petabytes.

The agility aspect of cloud computing is another clear benefit that most enterprises don’t consider, but it’s a key reason why many businesses remain with the cloud.

The transformative nature of this technology makes it an effective weapon for owning your market. Doesn’t that sound better than a 10 to 20 percent cost reduction?

Important Digital Marketing Metrics for CEOs

Before the advent of big data, marketing functions were largely focused on creating brand awareness through mass-market promotional efforts. Usually, these campaigns would be directed by gut instinct rather than any sort of quantitative analysis. As a result, it was often impossible to derive any insight into the sales growth they brought to the business. Understandably, the lack of clear value offered by these campaigns would often leave CEOs wary about investing in any further marketing.

Over the past decade, however, the landscape has undergone a seismic shift. Gone are the days when scattershot advertising and vague measurements could produce a successful marketing campaign. The introduction of multiple new digital touchpoints in the customer journey has created a far more complex sales cycle and, alongside it, brought a massive influx of data into the enterprise.

In order to make sense of the sheer volume and variety of information now available, CEOs must pay close attention to the marketing metrics that make a real impact on the profitability and competitiveness of their organizations. Here are some key digital marketing indicators that every CEO should be keeping an eye on.

Customer Acquisition Cost (CAC)

As the name suggests, this metric denotes the average cost of acquiring a new customer. It is calculated by dividing the total marketing and advertising costs (including salaries) incurred over a specific time period by the number of customers acquired over the same period. The ability to directly relate the number of customers acquired to a specific marketing investment has only become possible recently, with the arrival of user tracking tools that can follow a prospect from lead to revenue.
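As a sketch, the calculation looks like this in Python; the spend figures and customer count below are hypothetical:

```python
def customer_acquisition_cost(marketing_spend, advertising_spend,
                              salaries, customers_acquired):
    """Average cost of acquiring one new customer over a period."""
    total_cost = marketing_spend + advertising_spend + salaries
    return total_cost / customers_acquired

# Hypothetical quarter: $50k media spend, $30k ads, $40k team salaries,
# 600 new customers acquired.
cac = customer_acquisition_cost(50_000, 30_000, 40_000, 600)
print(f"CAC: ${cac:.2f}")  # CAC: $200.00
```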

For CEOs, this metric provides the most effective measurement of investment against return, showing whether the revenue created by the company’s marketing efforts justifies their cost. If the value generated by promotions and advertising isn’t in line with expenditures, then the campaign will need to be either optimized to create greater sales or scrapped altogether.

Customer Lifetime Value (LTV)

According to research group Gartner, a mere 20% of repeat customers account for over 80% of business profits, which goes to show how important long-term customer relationships can be for any organization. In today’s hypercompetitive marketplace, organizations are forced to dedicate more and more resources to customer service and retention programs. The efficacy of these investments is clearly shown through the lifetime value metric.

The easiest way to calculate LTV is to take a sample of customers and record their average spending over a set period of time, multiplied by your gross margin, giving the average profit each customer generates per period. Divide this figure by your churn rate for the period, that is, one minus your retention rate (customers at the end of the period, minus customers acquired during the period, divided by customers at the start of the period). Finally, divide the resulting LTV figure by your CAC, and you can compare the value created by customers against the cost incurred to acquire them.
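A minimal sketch of that calculation in Python, with every figure invented for illustration (the $200 CAC is carried over from the hypothetical example above):

```python
def retention_rate(customers_start, customers_end, customers_acquired):
    """Fraction of existing customers retained over the period."""
    return (customers_end - customers_acquired) / customers_start

def lifetime_value(avg_spend, gross_margin, churn_rate):
    """Simplified LTV: average profit per period divided by churn."""
    return (avg_spend * gross_margin) / churn_rate

# Hypothetical period: 1,000 customers at the start, 1,050 at the end,
# 150 of them newly acquired; average spend $500 at a 40% gross margin.
retention = retention_rate(1_000, 1_050, 150)  # 0.9
churn = 1 - retention                          # 0.1
ltv = lifetime_value(500, 0.40, churn)         # $2,000
ltv_to_cac = ltv / 200                         # assuming the $200 CAC above
print(f"LTV: ${ltv:,.0f}, LTV:CAC = {ltv_to_cac:.1f}")
```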

An LTV-to-CAC ratio of three or more generally indicates a healthy return on customer investments, but a figure far above that may point towards under-investment in acquisition, and therefore unfulfilled opportunities for further growth.

Website Traffic

On their own, website traffic figures reveal little about the success of your marketing efforts; after all, without context, a click tells you nothing about the purchasing intent of the user. However, if you are able to track your visitors by traffic source, you can develop a much clearer picture of which investments are yielding the greatest return.

By identifying whether prospects came to your site via an organic search result, a paid advertisement, a social media link, an email referral or a YouTube video, you can work to optimize your clearest path to creating conversions.
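As a sketch of what that per-source view might look like, here is a minimal Python example; the session log, channel names, and figures are all invented for illustration:

```python
from collections import Counter

# Hypothetical session log: (traffic_source, converted) pairs pulled
# from an analytics export.
sessions = [
    ("organic", True), ("organic", False), ("paid", True),
    ("social", False), ("email", True), ("paid", False),
    ("organic", False), ("youtube", False), ("email", False),
]

visits = Counter(source for source, _ in sessions)
conversions = Counter(source for source, converted in sessions if converted)

for source in visits:
    rate = conversions[source] / visits[source]
    print(f"{source:>8}: {visits[source]} visits, {rate:.0%} conversion")
```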

Conversion Rate

For digital campaigns, a conversion rate measurement based on the number of sales created through referral links will give you directly attributable information about the ongoing success of the campaign. To keep track of this figure, simply divide the number of sales conversions created through the CTAs on your landing page by the number of website visitors gained from the marketing effort.
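In code, the measurement is a single division; guarding against zero traffic is the only wrinkle worth noting. The campaign numbers below are hypothetical:

```python
def conversion_rate(conversions, visitors):
    """Share of campaign visitors who converted via a landing-page CTA."""
    if visitors == 0:
        return 0.0
    return conversions / visitors

# Hypothetical campaign: 4,200 referred visitors, 126 sales.
print(f"{conversion_rate(126, 4_200):.1%}")  # 3.0%
```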

Social Media Engagement

Unless you run a massive multinational corporation and employ every analytics tool known to man, it can be extremely difficult to trace conversions directly to social media marketing efforts. According to new research from eMarketer, fewer than 20% of B2B marketers are able to measure the ROI of their social media campaigns. Simply put, the analytics data provided by these platforms is insufficient to foster any real understanding of the revenue generated by social media marketing.

Yet any organization would be foolish to overlook the importance of social media in interacting and communicating with established and prospective customers. So how can you derive any insight into the value of your social media presence? Through the one clear metric you do have available: social engagement.

By keeping track of the likes, shares, retweets, and subscriptions generated by your content, you can gauge how successful your online engagement strategies have been in transforming customers into full-on brand advocates.
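One common way to roll those signals into a single number is an engagement rate per post: total interactions divided by reach. A minimal sketch, with invented field names and figures:

```python
# Hypothetical post-level stats exported from a social dashboard.
posts = [
    {"likes": 320, "shares": 45, "comments": 28, "reach": 12_000},
    {"likes": 150, "shares": 12, "comments": 9,  "reach": 8_500},
]

for i, post in enumerate(posts, start=1):
    interactions = post["likes"] + post["shares"] + post["comments"]
    rate = interactions / post["reach"]
    print(f"Post {i}: {interactions} interactions, {rate:.1%} engagement")
```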

Database decisions: AWS has changed the game for IT

You may not have heard of OpenSCG, but Amazon Web Services has. A week ago, AWS quietly acquired the PostgreSQL migration services company founded by PostgreSQL veteran Denis Lussier. While some PostgreSQL fans weren’t happy about the move, the OpenSCG acquisition is emblematic of a much larger move by AWS to serve a wide array of database needs.

At the recent AWS Summit, Amazon CTO Werner Vogels said as much, declaring that “what makes AWS unique is the data we have, and the quality of that data.” Taking a slap at Oracle in particular, Vogels derided the “so-called database company” for offering far fewer relational database services than AWS, and just a fraction of the array of database services that AWS offers (including NoSQL offerings).

With more than 64,000 databases migrated to AWS in just the last two years, AWS looks set to hold even more enterprise data.

AWS doesn’t tend to announce its acquisitions. They’re invariably small, not triggering any legal requirement to announce them, and while some companies acquire so they have products to sell, AWS only acquires complements to the services it builds in-house.

Nor is it surprising that AWS would be interested in the PostgreSQL sponsor. As one Reddit commenter mentions, “True PostgreSQL expertise is difficult to come by and OpenSCG has a lot of it. If you combine that with Amazon’s clear support of deploying Postgres-related products (RDS/Aurora/Redshift) and its message of #DatabaseFreedom, … it becomes pretty clear why AWS was interested in OpenSCG.” OpenSCG has been an AWS partner for some time, and it has particular expertise in helping companies migrate to PostgreSQL.

Which is, of course, perfect for an AWS intent on migrating orders of magnitude more database workloads than the current 64,000.

AWS seeks to be the “every database” store

Not all those database workloads involve PostgreSQL, of course. Although the open source database has experienced a renaissance of popularity over the last few years, it’s just one of the various databases that AWS supports. AWS has been aggressively decomposing applications and infrastructure to give its customers the specialized services that let them develop what they want, Vogels says, “instead of AWS telling them what they must develop.”

You want PostgreSQL? AWS can help with that. How about a NoSQL database with infinite scale and predictable performance? AWS has that, too, with DynamoDB, but also through partners like MongoDB that run a large percentage of their workloads on AWS.

The list goes on.

And on.

All of which leads to the question “What does this mean for IT’s database decisions?”

The database choices aren’t like they used to be

Oracle’s and Microsoft’s trump cards to date have been that they collectively own three of the world’s most popular databases: Oracle Database, MySQL (also owned by Oracle), and Microsoft SQL Server. As data has changed, however, these trump cards have lost some of their luster, serving at times as an almost unwelcome crutch. Oracle has missed the market transition to big data applications.

By contrast, Microsoft has not rested on its laurels, releasing a spate of database options, including Cosmos DB. Although Microsoft Azure has fewer database alternatives than AWS, it’s a strong No. 2 to AWS’s leadership position. So far, developers have preferred AWS’s approach of offering maximum database choice, fitting particular databases to specialized needs. Even so, Microsoft at least has a credible strategy.

Oracle, by contrast, has spent years ignoring or deriding the cloud, then basically fork-lifting its database to the cloud. A year ago, it made the silly move of trying to raise the price of running Oracle on AWS, hoping to get customers to defect from AWS and run those workloads on Oracle’s struggling cloud. It hasn’t worked.

Nor will Oracle have much hope if AWS continues to move more database services into its arsenal of serverless functions. As industry expert Simon Wardley posits, “As Amazon’s serverless ecosystem grows, the more metadata it can mine, the faster its rates of innovation, customer focus, and efficiency. Once it gets to around 2 percent of the market then it’s game over for all those not playing at scale.”

Microsoft and Google are sprinting to add database services, including serverless options. Oracle keeps muddling through with a 1980s way of thinking about the database, and it’s going to cost the database hegemon its lofty market position.

Meanwhile, AWS keeps steadily building out the database services developers require for next-generation applications, all while improving its abilities to migrate existing workloads to AWS.