How to Generate Leads for Your Website

posted on August 6, 2021


Getting great leads is crucial for a successful business! Watch to learn different tips and tricks to make sure your website is bringing in great leads! 0:00 – Intro 0:46 – What should I do before looking for leads? 1:33 – Tip #1 Optimize your website 1:52 – Tip #2 Optimize social media profiles 2:10 […]

Configuring your website in WinSCP

posted on July 22, 2021


Please consider subscribing to our YouTube channel to see more content like this! Canadian Web Hosting is a Vancouver-based web and cloud hosting company. We specialize in hosting business and enterprise-class clients from around the […]

Build a Website Within an Hour with InMotion Hosting’s WordPress Website Builder

posted on June 16, 2021


Build a website in under an hour without code! Our WordPress Website Builder makes it easy to launch your website to the world. Best of all, our Website Builder is free with the purchase of any WordPress Hosting Plan. Choose from hundreds of themes to get started, then use our intuitive drag and drop […]

Four Types of Remarketing

posted on June 11, 2021



Marketers know the majority of site visitors leave without taking any action. These people can be encouraged to return to the site with the help of traditional or dynamic remarketing.

Nowadays, however, the Internet is full of advertising, so it’s vital for each campaign to stand out from the crowd by using the most relevant and useful content. To achieve this, you can use the incredible potential of special types of remarketing.

The aim of this article is to expand your knowledge of remarketing possibilities and explain new ways to use remarketing to reach your goals.

Multi-level remarketing
Thanks to this function, you can show users different adverts promoting new offers, and even change the frequency of ad impressions, depending on the number of days since their last visit to your site. Such tactics are suitable for products or services that have a long sales cycle or trial period.

For example, we have a list of people who have bought a printer in our store. Our goal is to show them a banner with an offer to buy ink after three months. To do this, we’ll create one remarketing list with a 90-day duration, and an identical list with a 120-day duration. In the next step, we’ll combine these two audiences: choose the ‘Custom combination’ option from the drop-down menu.


Then we add the audience with the longer period of time by using the ‘Any of these audiences’ condition. The ‘None of these audiences’ condition will help us exclude the audience with the shorter period of time.


In the last step, create a new campaign or ad group for this new audience. As a result, our ads will start being displayed 90 days after the visitor bought a printer at our online store, and will follow the customer for another 30 days.
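The combined-audience logic above can be sketched in a few lines of Python. This is a hypothetical illustration of the set arithmetic only, not AdWords code; the 90- and 120-day durations mirror the two remarketing lists.

```python
from datetime import date, timedelta

def in_combined_audience(last_visit: date, today: date,
                         short_days: int = 90, long_days: int = 120) -> bool:
    """True if the visitor is on the 120-day list but NOT the 90-day list,
    i.e. their last visit was between 90 and 120 days ago."""
    days_since = (today - last_visit).days
    on_long_list = 0 <= days_since <= long_days    # 'Any of these audiences'
    on_short_list = 0 <= days_since <= short_days  # 'None of these audiences'
    return on_long_list and not on_short_list

today = date(2021, 6, 11)
print(in_combined_audience(today - timedelta(days=100), today))  # True: in the window
print(in_combined_audience(today - timedelta(days=30), today))   # False: still on the 90-day list
print(in_combined_audience(today - timedelta(days=150), today))  # False: dropped off both lists
```

The key point is the set difference: membership in the longer list minus membership in the shorter one leaves exactly the 30-day window in which the follow-up ads run.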

Similar audience
A similar audience is a special kind of remarketing audience made up of people who have never visited your site, but whose behavior and interests closely resemble those of your customers.
Using similar audiences is a simple and effective way to increase your traffic.

AdWords records which sites users have viewed on the Display Network over the last 30 days, and uses this information to automatically select a new group of potential customers within the network with similar interests and behavior.

Google makes remarketing lists for creating similar audiences by itself, so lists with similar audiences will automatically appear next to your remarketing lists in Shared libraries. Please note that you won’t be able to see such lists if your remarketing audience contains cookies for fewer than 500 visitors:


We recommend creating a separate Google Display Network campaign for similar audiences, because quality indicators for these ads could be lower than for the traditional remarketing campaign.

Remarketing lists for search ads
With the help of traditional remarketing lists, you can display ads with special content or adjust your bids depending on the category of your site visitors.

To make your search ads visible to users from your remarketing list, the list should contain cookies for at least 1,000 visitors. You can check this from the Audiences tab in Shared libraries.


There are several strategies for using remarketing audiences for search ads:

  • Add an audience to the existing campaign and increase bids for recent visitors to your site;
  • Create a new campaign with a specific audience and adapt the text to your needs (offer a promotional coupon, promote special deals and so on);
  • Exclude past visitors who are still interested in products similar to yours, so that the campaign attracts new visitors only;
  • Target your ads at previous buyers who are interested in related products, or at high volume keywords that haven’t been used in a main advertising campaign.

Remarketing for YouTube viewers

If your brand has its own channel on YouTube, you can use remarketing for video viewers.

With the help of AdWords for Video, you can create remarketing lists of users who have completed one of the following actions:

  • Interacted with or viewed your YouTube videos;
  • Subscribed to or unsubscribed from your channel;
  • Viewed your TrueView in-stream ads;
  • Visited your channel.

To create a video remarketing list, your YouTube and AdWords accounts have to be linked. If you haven’t made advertising campaigns for video before, please follow the guidelines at:

To create a remarketing list for YouTube viewers, go to the navigation panel and choose the ‘Video Remarketing’ option in the Shared Library, click ‘+Remarketing list’, then choose a new remarketing list type and fill in the other fields.


You will be able to manage your lists from the ‘Targets’ tab of the video campaign.

As we promised at the beginning, we will now explain two ways to optimize your remarketing campaign.

One of the options is to exclude placements with low CTR. On some sites, advertising banners aren’t placed in the best position, which is why users rarely spot the ads. To exclude such low-performing placements, go to the Display Network report, find the Placements tab and analyze the CTR, bounce rate and conversions.


Highlight placements like those shown in red, click ‘Edit’ and select ‘Exclude.’



Before you exclude large platforms such as YouTube, remember that you can only exclude individual URLs on the site. To identify such URLs, select the site and go to the See details menu, then choose the ‘Selected’ option.


You can also add to the list of excluded placements those sites where visitors have a high bounce rate (this information can be obtained from Google Analytics).

The second method of optimizing your remarketing campaigns is to exclude certain categories of sites, so you avoid displaying your ads next to any content that you would not like to be associated with your brand. By this we mean different kinds of sensitive content, mobile apps, games or social networks. By default, only gambling is excluded.

You can find this option by going to the ‘Display Network’ tab, then scrolling down to the bottom of the page and clicking on the green ‘+ Options’ button.


You can also turn off showing your banners below the fold here.


As you can see, remarketing is an evolving discipline, which provides plenty of opportunities for smart advertisers. To take advantage of the benefits of remarketing, you simply need to determine which types of retargeting are most suitable for your business, then add them to your traditional remarketing campaign.

Why is Japan’s handling of this pandemic so bad compared to their neighbors?

posted on June 11, 2021


I think the wealthy nations that fared the worst have something in common: Italy, Spain, Japan, Canada, the US, Brazil, Sweden in the early days, all treated the economy and health as being opposing priorities, and tried to get away with half-assing both.

The ones that did the best are the ones that had strong, swift lockdowns and made an effort to offer even-handed, if not always adequate, ongoing financial aid in the meantime.

Countries like Canada and the US and Japan, which applied ridiculously convoluted means testing and conditions on financial aid, saw undue politicization and partisanship, extreme processing delays, uneven distribution, extinctions of entire industries, and, most pertinent to your question, increased unwillingness among the public to cooperate with health measures.

If you give people a choice between potentially getting sick or definitely losing their jobs, they’ll typically choose the former. And then they’ll retcon whatever narrative it takes, even if it’s a crazy conspiracy theory, to manage the guilt.

To give you a sense of what I mean by a surge, here are Canada (where I live) and Japan as of today (May 11):

There are no excuses for this. I think I should clarify, for those interested (it’s a long read), why, despite the low death toll and apparently low infection rate, I agree with the premise of the question, that Japan is performing poorly. As I’ve mentioned in several of my comments:

The people of Japan have held up relatively well, all things considered. The lack of testing can hide case counts — and it’s definitely doing that — but a high death toll is harder to cover up, especially for a disease that’s so deadly that in many places it has become the number-one killer. That might be because of the ubiquity of masks, physiology, lack of handshaking and hugging, prior exposure to similar viruses, the vowelly nature of the language, or some other accidental or semi-accidental reason. In any case, it wasn’t the success of a national response to the pandemic.

The nation of Japan has utterly, catastrophically failed its people, and the evidence of that is in the overwhelmed medical system and the economic turmoil. The complete, deliberate refusal to develop an exit plan (a strategy which they call “with Corona”) is not only putting Japan at future risk, it is also causing current, palpable desperation, reckless defeatism, panic, fear and distrust of important institutions and of other people. That’s all very real damage.

Keep in mind that covid started with one patient. Covid’s growth, being that it infects more than one person per person, is exponential. Treatment for potentially fatal covid cases requires hospital resources, of which there’s only ever a more or less fixed amount — the discharge count can only grow linearly; the remainder die. An exponential minus a linear is exponential, which is to say that, even if things seem safe now, it can easily go horribly wrong overnight. The only way to fight a phenomenon of exponential growth is to nip it in the bud or to be ready to mobilize en masse in case it does explode — that is, to have the one thing that can outrun exponential growth: national policy. Japan has done neither.
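The "exponential minus a linear is exponential" point can be made concrete with a toy simulation. The growth rate, capacity and discharge figures below are invented for illustration; they are not actual Japanese data.

```python
# Toy model: admissions grow exponentially (+5%/day), discharges are a
# fixed linear outflow, capacity is fixed. All figures are invented.
capacity = 10_000            # total beds
discharges_per_day = 300     # linear: roughly constant per day
occupied, new_admissions = 1_000.0, 100.0

breach_day = None
for day in range(1, 181):
    new_admissions *= 1.05                                    # exponential inflow
    occupied = max(occupied + new_admissions - discharges_per_day, 0.0)
    if occupied > capacity:
        breach_day = day
        break

# For the first few weeks occupancy actually falls (the illusion of
# safety); then the exponential term overtakes the linear one and
# capacity is breached within a couple of months.
print(breach_day)
```

With these made-up numbers, occupancy drops to zero early on and then blows past capacity around week seven — which is the "it can easily go horribly wrong overnight" dynamic described above.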

Hospitals in many parts of Japan are overwhelmed, or are at risk of being overwhelmed. This is the beginning of a vicious cycle observed throughout the world, in which nurses and doctors become exhausted or fall ill themselves and quit in droves, which in turn leads to poorer treatment both for covid and for other treatable illnesses, which leads to worse health outcomes, more suffering, more sick people, and even more expensive treatment in the form of emergency room visits, palliative care and the need for more precious resources such as respirators.

It’s only while the number of people being admitted to hospital is lower than the full capacity of the medical system that an illusion of success against the pandemic can be maintained. Most of the world learned that lesson in 2020, either through the news or through experience.

The economic suffering is profound. 1 in 5 female students are having trouble affording feminine hygiene products; as of July of last year, the unemployment rate was already the highest it had been since the end of the war (and lower-than-minimum-wage gig workers are counted as having jobs); 60,000 homeowners are having trouble paying their loans. It’s clear that even if you don’t see poverty on the streets every day where you live, it is quite severe now as a result of covid. There are too many such symptoms to list them all here, and the on-again, off-again, vague, symbolic, ineffectual states of emergency are only prolonging the suffering.

Until recently, every time the case count grew, it was in areas opened to Abe’s idiotic Go To Travel campaign, which had people supporting the tourism and service sectors by traveling for fun in the middle of a pandemic, in the winter. Japan, Germany and Italy have borrowed by far the most per GDP on covid, and yet people were forced to resort to this kind of insanity. Personally, I don’t lose sleep over the risks of overspending (Japan’s national debt has always been terrifyingly high, and yet it’s never seen very high inflation), but this should be a clear indication that something is wrong with the spending that is taking place.

Finally, the Olympics: the government and the IOC are dead-set on going through with it at any financial cost, public opinion cost or human cost, entirely out of sunk cost fallacy and in service of avoiding relatively small cancellation fees. A solid 80% of the nation opposes the Games, 70%+ of businesses report they will likely see little or no detriment in its cancellation, and it is already causing more outbreaks. Hokkaido saw the nation’s worst daily case number just recently, two weeks after Olympics rehearsals were held there. Nurses are being pulled out of hospitals to service the Olympics, and 70% of volunteer nurses in Ibaraki prefecture have already resigned.

Granted, pressure from the IOC is largely to blame for this situation, but it’s quite clear that Japan is very much in a position to kick them to the curb. But the Japanese government wouldn’t know that, given they brazenly admit to never even having read the fine print on the matter.

By the way, I’m not letting Canada, the other nation with which I identify, off the hook, either. Canada hogged 9 times as many vaccine doses as there are people, presumably in an effort to hold for ransom the health of people in less affluent nations as well as global efforts to address the pandemic. Believe what you will about the importance or safety of vaccines, but I don’t know how to describe this behavior other than homicidal.

And yet numerous cities are still in disarray as of now (May 23), and vaccinations are only now ramping up, though quite sharply, as demonstrated by the 45% first dose vaccination rate vs. the measly 4.2% second dose.

Trusted Link Building Techniques

posted on June 9, 2021



Internet marketers are always searching for the latest, greatest technique to boost numbers. While it’s important, especially in digital marketing, to always keep one eye on what lies ahead, it’s equally (if not more) important to remain focused on what has worked and is currently working for you.

When it comes to link building, a lot has changed over the years. As search engines have cracked down on spammy and automated techniques, it’s now more important than ever to be using trusted and true strategies that actually work. While there are lots of new creative link building techniques, there are also powerful strategies that have been around for some time and are continuing to produce great results.

So, what are the five most trusted link building techniques that you can count on in 2015 (and beyond)? Check them out here.

Guest Posting

The most tried and true of all techniques lives on!

Yes, it’s 2015, and yes, guest posting is still one of the most tried and true link building techniques. Don’t let people tell you guest posting is dead, because it’s not. While guest posting has clearly changed from what it used to be, it’s by no means dead; in fact, it’s alive and well.

In terms of how guest posting has changed, it has in fact become a much more legitimate form of link building than it used to be. What once might’ve been easily automated and done at scale is now pretty much unattainable unless handled on a more personal level. Conducting real outreach to quality blogs, with real accomplished writers and journalists, is how you’re going to get links in today’s digital marketing climate.

Guest posting in 2015 is less about the link you’re getting, and more about the quality blog post you’re posting. The link is an added benefit!

How-To Guides

Become an information powerhouse, and get links while you’re at it!

Creating informative, niche how-to guides and hosting them on your website or blog is another great step to take in order to start gaining valuable links. It might take a significant amount of time or a lot of effort to create a guide valuable enough to really help you get good links, but the time and effort you put in will come back to you tenfold.

Using videos, images, quotes, and more in your how-to guides is imperative if you want to see them turn into link building powerhouses. Once you’ve built up a solid guide (or, for best results, multiple guides) it’s time to start promoting them. Conduct blogger and influencer outreach to a variety of different outlets, and share the guide with them. Share it constantly on social media, and be sure to target niche specific people and blogs.

Building how-to guides is a link building strategy that, with enough effort put in now, will pay off for many years to come.

Expert Interviews

Be the source of expert information, and the links will follow.

Conducting interviews with experts in the niche you’re link building in continues to be one of the greatest ways to gain a high number of valuable links. While it’s a strategy that requires a lot of waiting, and a lot of time to put together and finally get live, it’s almost a sure-fire bet to get a number of great links while networking with some of the better-known names in the industry.

Craft a personal email with a number of questions for the expert that you’d like to interview. If you want this piece to just focus on one expert, that’s fine, but ideally you’ll craft a blog post or content page with quotes and analysis from a handful of experts, as this will likely get you the most links. Once you’ve collected the answers and quotes, and have packaged the content together, reach back out to the expert with what you’ve come up with. If you put a lot of work into it, it will show, and you’ll likely get some social shares and a link back to your work.

Consider reaching out to many more experts than you think you’ll need, because chances are many will flat-out ignore your emails for whatever reason.

Event Sponsorships

Look to find sponsorship opportunities in your niche, and you’ll get relevant links.

Sponsoring events is another excellent link building strategy that companies across many niches are utilizing in 2015. Depending on the size of the event and level of sponsorship, the potential of gaining a huge number of quality links is substantial. Events such as conferences typically list every single sponsor, and as other blogs start writing about the event there’s a chance you’ll get some links back that way too.

While sponsoring events is a great way to get links, it’s important to keep in mind that there is a chance these links will eventually get removed. Use various search queries to find good events that you can sponsor.


Scholarships

Offering scholarships is one of the more up-and-coming link building techniques that more and more companies are starting to utilize. It follows much the same idea as sponsoring an event, but scholarships will likely result in more links from higher-quality sites (and possibly at a lower price, too).

There are myriad places on the web that compile lists of scholarships available to high school and college students as a helpful resource. Sponsoring a scholarship by offering, say, $1,000 to the winner will help your company appear on these lists. You build a page on your website with all the information needed to apply for the scholarship, then promote that page to the scholarship aggregators.

While the initial startup cost of offering a scholarship might be high, chances are you’ll see a nice return on investment due to the number of high quality EDU links you’ll be getting.

What works for you?

All in all, these five strategies have worked for businesses small and large for many years. Unless something dramatically changes, they’ll continue working in the future, too. But, just like with any online marketing strategy, it really boils down to what works best for you and your business; results can vary dramatically from one business to another!

Article Marketing Kept My Websites Thriving

posted on June 7, 2021


You may even spend days on end tweaking the content on your website to make it more SEO-friendly, never really knowing if what you are doing will have any impact at all.

That is the biggest problem with search engine optimization in 2016. It is much harder to draw a straight line between the SEO activities you undertake and your rankings in Google’s search results.

When Google was still young, we could play SEO games all day long and move our websites up and down Google’s search results with simple changes to our websites.

But Google has matured a lot in the last several years, wiping out most of our ability to make simple changes to see massive results.

I was playing Google like a fiddle in the 2000s. I knew exactly what to do, and I had the skills to do it, to push anyone’s website to the top of Google’s search results. I was so good at what I was doing that I was even able to offer my services as a search engine optimization provider and collect monthly fees in the range of $18,000 per month.

Google killed my business in an afternoon by announcing it was going to kill paid links in their search algorithms.

Now, I tried to explain to my clients that this announcement would have no effect on what I was doing, because I wasn’t paying anyone to place links to their websites.

But to no avail… each of my clients said, “We are paying you and you are creating links for us, so Google is talking about us.”

I am so glad I got out of the SEO industry in 2008, because clients would freak out every time a Google employee sneezed.

Because I knew my clients were wrong, I continued to work my magic with my own websites, and I continue to benefit from those activities even today as 2016 is winding down.

I mothballed another of my websites in 2010. I didn’t shut it down, but I quit adding new content to it and promoting it actively. I also took down all of my buy buttons. We are coming up on the seven-year anniversary of when I stopped supporting that website. This particular website still received traffic from 142,000 unique visitors in 2015, and it is on track to match the same traffic levels in 2016.

The bottom line is that I have been using article marketing to promote my websites online since circa 1999.

Article marketing works as well today as it did 17 years ago.

The Day Article Marketing Died

If you are like me, you might have heard about “the day article marketing died” – Feb. 23, 2011.

That was when Google introduced the Farmer/Panda updates, and the major article directories lost between 85 and 94 percent of their footprint inside Google’s search results.

Google pretty much decimated all of the article directories in a single day.

The article marketing industry was turned on its head, and many providers in this niche lost their shirts.

Article directory marketing died, and it has never returned to prominence, despite hundreds of companies trying to hang on and find ways around the embargo.

But Here is the Thing That Might Surprise You

The traffic to my website was barely affected, because I had never really used the article directories to promote my articles.

My articles were published on websites, inside blogs and newsletters. The Farmer/Panda updates did not impact the websites where my articles were published. In fact, those properties likely benefited from the death of the article directories.

My articles were published in places where a human editor chose my article from a stack of articles on their computer.

Online publishers who are driven by a need to keep readers happy will focus only on publishing articles that they have reviewed and decided were a good fit for their publications and their audiences.

This is the Article Marketing Secret Sauce

In order for your article marketing activities to be successful, you need to be able to get your content placed somewhere where a human editor is reviewing your article for placement on their website.

The secret sauce is “human reviewed and audience approved.”

With the advent of social media, “audience approved” is easy for Google to identify. If people enjoy an article, they will share it on Facebook, Twitter, Google Plus and LinkedIn.

If Google is able to see how often a particular article has been shared on social media, it knows that the particular article offers a lot of value to its readers.

The Proof of This Idea Can Be Found in Mashable

Founded in 2005, Mashable is the top source for news in social and digital media, technology and Web culture.

According to its advertisers page, Mashable has 45 million monthly visitors, and it has 29 million followers on social media. Mashable also indicates that one of its posts is shared on social media every three seconds, and 55% of its traffic comes from mobile users.

I took a few minutes to scroll through Mashable’s front page to look for articles on its website that might stand out in Google’s search results. Here are just three examples:

  • I found an article titled, “People Were Nuts About Guns This Black Friday”. This article had 490 social media shares. I Googled “people nuts about guns” and Mashable was the No. 1 listing. A search for “guns Black Friday” also showed the Mashable article, in the No. 8 listing.
  • Another article I found was called, “Online Sales for Black Friday and Cyber Monday Broke All Records”. It had 547 shares. I queried “online sales black friday cyber monday” inside Google without the quotes, and Mashable was the No. 9 listing.
  • The final article I checked was titled, “Samsung Brand Autopsy: How Can The Company Earn Back Trust?” It had 513 shares. When I queried Google for “Samsung brand autopsy” and “Samsung brand”, the Mashable story was ranked No. 1 in Google. Even a search for “Samsung company” shows the Mashable article in the No. 8 spot in Google’s search results.

Mashable is ‘willing to take’ articles from people like you and me. To increase the likelihood of seeing your articles published on Mashable, I encourage you to read the article ‘12 Things Not to Do When Pitching a Story to Mashable’.

As seen above, Google has proven that article marketing isn’t actually dead to them. Article directories are dead, but article marketing is not.

Because That Is Where The Money Is

During his 40-year criminal career, Willie Sutton stole approximately $2 million and spent half of his adult life in prison for his crimes.

When asked why he robbed banks, he said “Because that is where the money is.”

Which brings us to the question of why I have always used this kind of article marketing to promote my websites… “Because that is where the traffic is.”

I don’t worry about doing SEO for my own websites. I worry about getting my articles into a website that already has tons of traffic.

For several reasons:

1. I want my share of their traffic.

2. If their readers share my article on social media, then Google will share my article with more of their users.

3. If social media shares impact how Google perceives the value of my article on a major website, then that value will also transfer to my website via the link in my author’s resource box.

4. A few really valuable web pages linking to my website trump thousands of links from low-value web pages.

In Conclusion

There is nothing really hard about this approach to search engine optimization.

All you really need to do is to focus on creating content that people will want to read, then put it in a place where lots of people already go to find the information that they want to read.

What are some easy YouTube SEO hacks?

posted on June 7, 2021


SEO, which stands for search engine optimization, is the process of maximizing the number of visitors to a particular piece of content by ensuring that it appears high on the list of results returned by a search engine.

In a nutshell, SEO is the process of actively trying to get your content in front of the right audience. Even better, you want them to see your content when they happen to be searching for education on products or services similar to yours. This is achieved through the use of strategic keywords, as well as a few other factors we’ll discuss a little later.

How does SEO work for YouTube Videos?

YouTube sees more than 3 billion searches per month.

And there are a number of factors at play in the YouTube results algorithm, including video length and engagement rates. But SEO has a massive role to play, too.

Achieving great SEO rankings for your YouTube videos is a strategic process that involves optimizing your channel, playlists, and metadata descriptions for specific keywords. It also involves editing your videos themselves to follow specific SEO best practices.

Let’s round things off with a few bonus hacks to help your YouTube SEO soar:

Keep it long

YouTube was built to compete with television (and its advertising rates). Therefore, long, high-quality videos tend to perform better in searches. Videos should last at least five minutes and include great information on your topic.

Share to social

Don’t miss out on the opportunity to share your video across your social networks. Not only will this drive users to your videos, but the backlinks will help the search engines index your content quicker.

Ask for the subscribe‍

At the end of your video, a simple one-line call to action can dramatically help you build your channel: “If you enjoyed this video, hit subscribe!” This request reminds people who like your content that they can follow you and hear more of the good stuff.

Where can I buy traffic to an affiliate link from ClickBank?

posted on June 7, 2021


I would feel remiss if I didn’t answer this, because there’s a HUGE flaw hiding in this question of eleven words.

You DO NOT under any circumstances send paid traffic directly to a Clickbank affiliate link; YOU WILL lose your money. There are no ifs, ands, or buts about it.

If you’re talking about building some sort of funnel and then sending traffic to that, then go right ahead; it’s a good idea. But sending paid traffic directly to a Clickbank affiliate link is a huge mistake.

Here, let me walk you through a scenario:

You’re promoting a ClickBank offer that will earn you a 100% commission of $50. You go and buy $300 worth of clicks from a solo ad dealer. Let’s even put each click at $0.40, the lowest I’ve seen. That’s 750 clicks.

Out of those clicks, you get a 50% opt-in rate: 375 people entering their name and email. Out of those 375, you get two sales. Congratulations, you just grossed $100 on a $300 spend. You’re down $200.

So, you decide to try again, with a different vendor (or the same). You pay the same $300, but this time nobody opts in. Now you’re down $500 total…

So, you buy more ads, but this time you filter for prime traffic. That’s a minimum of $0.10 extra per click, and that’s being generous; most charge up to $0.30 extra. Say 160 clicks at $0.70 each: $112 in solo ads.

This time, you get 100 people to enter their name and email address and you make, once again being generous, five sales. That’s $250. You’ve now spent $712 on traffic to earn $350: still a $362 loss.

Guess what, you’ve also left at least another $800 on the table. Where did I get that number from?

Simple: with a funnel and those email addresses, you could retarget the roughly 475 people who opted in and make additional sales over time. And I’m low-balling this at 16 follow-up sales total, when it’s possible you could make much more through automated email sequences or chatbots.
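To make the arithmetic easy to check (and rerun with your own numbers), here is a minimal Python sketch of the scenario's economics. All the figures — spend, click cost, opt-in rate, sales — are the illustrative ones from the walkthrough, not real campaign data:

```python
def campaign(spend, cost_per_click, opt_in_rate, sales, commission=50.0):
    """Model one solo-ad buy: returns (clicks, opt_ins, profit)."""
    clicks = round(spend / cost_per_click)
    opt_ins = round(clicks * opt_in_rate)
    profit = sales * commission - spend
    return clicks, opt_ins, profit

rounds = [
    campaign(300, 0.40, 0.50, sales=2),    # round 1: 750 clicks, two sales
    campaign(300, 0.40, 0.00, sales=0),    # round 2: nobody opts in
    campaign(112, 0.70, 0.625, sales=5),   # round 3: pricier "prime" traffic
]

total_profit = sum(p for _, _, p in rounds)
leads = sum(o for _, o, _ in rounds)
foregone = 16 * 50.0  # follow-up sales a funnel could have made from those leads

print(total_profit)  # -362.0: the raw ad buys lose money
print(leads)         # 475 opted-in leads a funnel would have captured
print(foregone)      # 800.0 in retargeting revenue left on the table
```

The point the numbers make: without a funnel, every round is a one-shot gamble; with one, the opt-ins compound into follow-up revenue.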

8 Tips for Writing LinkedIn Blog Posts That Expand Your Influence

posted on June 6, 2021



LinkedIn is a great professional network, assuming you know how to use it right. It doesn’t matter whether you are a freelancer or looking for a permanent office position; either way, you can make your profile more impressive and build your influence with the help of LinkedIn.

How can you do that? By writing interesting and compelling blog posts that would be valuable to your audience. Here are eight tips to help you do just that.

1. Know the message you want to deliver.

Before you start writing, you need to know what you plan to achieve with your blog posts. Maybe you want to become a niche expert, grow your network, share your experience with others, or build up your professional value.

After you decide, focus on the message that would help you achieve your desired result. Maybe you could give valuable financing tips, or create amazing infographics to illustrate your posts or give some insights on working for a certain company, etc. Make sure that your content is focused on this type of message; this way you’ll become well-known for something specific and attract the attention of the people that you’re interested in.

2. Keep your network in mind.

Maybe you already have a network you want to expand, or maybe you want to build one. Either way, you should keep this network in mind while coming up with blog post ideas.

Ask yourself: who are the people you want to attract and what things interest them? The answers could differ a lot, depending on the niche you’re working in. However, it’s important to stick to the related topics and to cover issues that are of current interest to your network.

3. Cover the trending topics.

Just as it’s important to stick to the topics that interest your audience, it’s important to cover the latest trends in your niche. The reason is simple: it’s the latest trends that go viral and drive more traffic.

LinkedIn’s editors themselves look for posts that cover the latest trends — and if the posts are good, they promote them via Pulse channels, driving even more traffic to your page.

4. Don’t make them too long.

When it comes to ordinary blog posts, the principle ‘the longer, the better’ often works well (when the posts are informative and well-written, of course). However, when it comes to LinkedIn posts, you should stick to short ones (300-800 words) instead of long ones.

LinkedIn is first and foremost a professional network, and not all professionals have time to read long articles. That’s why, when it comes to LinkedIn promotion, it’s better to craft short, highly informative articles instead of long, unclear ones.

5. Add a bit of personal information.

While your goal is to deliver informative content, this doesn’t mean that you have to strip it of your personality completely. After all, people can get similar advice in other places — it’s the personality that often makes them stay and continue reading the posts of a specific person.

Sharing something personal doesn’t mean telling the readers all about your life. Even adding some personal examples and coming up with your own advice counts as personal experience the readers would be interested in. For example, if you’re writing about essay topics, share some stories on how you came up with such topics during your college years.

6. Polish your posts.

Once you have finished writing, found (or created) an image to illustrate your article, and crafted a proper headline, you can move on to polishing your article.

Of course, there are plenty of online services that can help you with the proofreading part, but polishing is not only about that. Proofreading is a must, yes, but it’s also important to check the formatting, to ensure that all links are clickable and lead to the right sites, that all the tags you want to use are included and all the images open.

Once that is done, proofread it manually one more time, paying special attention to grammar and spelling. While online proofreaders are good for an initial check, they still aren’t as effective as a manual check.

7. Don’t forget to be social.

Promotion is just as important as writing high-quality posts. However, promotion is not only sharing your post on other social media, it’s also about making the most of LinkedIn itself. Remember that it’s a social network — you can drive more traffic to your page simply by talking with people there.

Don’t settle for responding to comments only. Comment on other articles as well. Join some groups that are related to your niche and join the discussions there. This could help you become noticed by others, build your authority, and gain more followers.

8. Know your limits.

When it comes to any blog writing, consistency is definitely the key. You should post regularly, preferably at the same time to build an audience and to look professional. However, what exactly does ‘regularly’ mean? Should you do this every day? Twice a week? Once every two weeks?

It’s always better to start with small goals. For example, you can try posting once a month for a start, making sure that your posts are indeed valuable, informative, and fresh. Once you do this for a few months, consider increasing the frequency to twice a month. Once this becomes easy, you can move on to posting once a week if you have the time to do so.

Remember that quality always beats quantity, and while frequency is important, not overdoing it is even more important. Not everyone can dedicate themselves to blogging alone (at least not when they start); most of us need to combine writing with a full-time job. By deciding to take it slow, you ensure that you can produce high-quality content and monitor the results effectively without straining yourself too much.

How do I create a profile backlink?

posted on June 6, 2021


This is not as difficult as it seems, but placing high-quality, organic backlinks does require specific expertise. Many people skim a couple of articles about SEO and backlinks over a couple of days, start doing this, and are then surprised that they are failing... It isn't wise to expect success in that case.

I analyzed and studied these processes at length to achieve success and get the desired result.
In addition to personal experience, I would like to share a few excerpts from this source: Strong and Natural Backlink Profile: How Does it Look Like? - Crowdo Blog. So, let's talk directly about how to create a profile backlink. First, in modern SEO there is no "perfect" backlink profile, as it differs from case to case. However, there are specific rules that characterize a healthy, natural backlink profile, one that can push your search results up.

First, it is essential to use some tools that make it easier to track your backlinks' health and performance. I'm talking about services like Keyword. You can even use Google's original keyword tracking service. I also know a few excellent backlink trackers, and they work fantastically well. Second, you need to review and classify domains very carefully. Moreover, a healthy backlink profile will consist of both dofollow and nofollow links (preferably around 70% DF and 30% NF). If all of your backlinks are dofollow, this may look unnatural to the search engine.
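As a quick illustration of that 70/30 rule of thumb, here is a small Python sketch. The `rel` values would come from whatever backlink tracker you use; the link list below is entirely made up:

```python
def dofollow_share(backlinks):
    """backlinks: list of (url, rel) pairs; returns the dofollow fraction."""
    nofollow = sum(1 for _, rel in backlinks if "nofollow" in rel)
    return (len(backlinks) - nofollow) / len(backlinks)

links = [
    ("a.example", ""), ("b.example", "nofollow"), ("c.example", ""),
    ("d.example", ""), ("e.example", "nofollow"), ("f.example", ""),
    ("g.example", ""), ("h.example", ""), ("i.example", ""),
    ("j.example", "nofollow"),
]
share = dofollow_share(links)
print(share)                # 0.7
print(0.6 <= share <= 0.8)  # True: close to the suggested 70% DF / 30% NF
```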

Don't forget about anchor text, one of the factors Google considers when checking your backlinks' quality. More information about this can be found in the backlink profile article above. If what I said seems complicated to you, there is a video on YouTube that explains all of this perfectly. Even a beginner will understand what is presented in the video, but it will also be useful for experts. Finally, I would like to remind you of one crucial SEO moral: "There are worse crimes than breaking the rules of search engines. For example, not reading them."
Remember and apply all these rules, and you will see progress in your optimization.

Can you make good money as an Amazon affiliate?

posted on June 6, 2021


The Best Ways to Make Money as an Amazon Affiliate (the Gurus Are Going to Hate Me for This)

Ask anybody to recommend an online store, and chances are they'll say Amazon. Amazon is one of the most trusted brands on the Internet for online shopping, and it was also one of the first companies to offer an affiliate marketing program. They saw the value of creating opportunities for people to work at home selling Amazon products. You can build a business on that trust and make money as an Amazon affiliate.

What You Need

Before you sign up as an Amazon affiliate, you need some sort of platform on which to promote products. The most common way to promote affiliate products is through websites and blogs. Although there are numerous free blogging sites such as Weebly and Google's Blogger, a paid blog or website looks more professional and will earn more trust. You also want to set up a PayPal account so that your commissions can be sent to you.

Getting Started

Becoming an Amazon affiliate is a matter of going to their website and signing up. Amazon calls its affiliate program "Amazon Associates", and the signup link is at the bottom of their page. Once you go through the sign-up process, you are given an associate ID. Be sure to read through all of their pages to fully understand how to place your unique ID in your links, so that you get credit for sales and get paid.


Choosing Products

When choosing products to promote as an Amazon affiliate, look for products that interest you. One of the best techniques for promoting a product is writing reviews or articles about it. Remember that as an affiliate marketer, you are selling a product to a customer without them seeing it first hand; you need to give them a reason to buy. Amazon has many tools in the affiliate center to help you promote merchandise, including product links, banners, and even the option to build an online store to place on your website.

Earning Potential

As an Amazon affiliate, your earning potential is unlimited. Unlike promoting a single product or ebook, you have the opportunity to earn a commission on any item a customer purchases at Amazon through your unique link. For example, let's say you have a website or blog about cooking and you place a few links to recipe books on your site. A reader clicks one of those links and is taken to Amazon through your unique link. The customer buys the book, but they may also buy a cooking gadget or a few other items while they're at Amazon. Instead of just getting a commission on the book sale, you get a commission on the entire order.

Amazon is among the best affiliate marketing programs and a viable option for generating a great income. You'll see just how quickly you can become an Amazon associate and make money at home promoting Amazon products. Go to Amazon and check out the affiliate program for yourself.

Are Squidoo And HubPages Still Worth It?

posted on June 5, 2021



The content world has been monitoring the steady decline in quality of Squidoo and HubPages since 2012. Back then, the query revolved around these two, and all the rage was which one was the better choice. But today, both platforms are giving every indication of going out the door and off the SEO table. But is this really the case? Can platforms like Squidoo and HubPages still be useful, and if so, how?

The Squidoo vs. HubPages Debate

Squidoo is a “community website platform.” Created in 2005, it allows its users to create pages (referred to as “lenses”) to sell products for profit or charity. As of October 2010, approximately 1.5 million “lenses” were in existence.

Wikipedia defines HubPages as a “user generated content [and] revenue-sharing website.” It was initially launched in 2006. As of December 2013, the site encompassed approximately 910,106 “hubs,” which are magazine-style articles covering specific topics that are user created and published. Nearly 74,000 users and 2.5 million forum posts were recorded in 2013.

According to GreekGeek, the debate surrounding which platform offers more value or benefit boils down to a battle of impressions versus interaction. Since 2007, it’s been a fairly hot topic: numerous users of both platforms attempted to compare notes, run content on both sites, and discover which of the two held an advantage and why. At the end of the day, HubPages seemed to be more about content while Squidoo felt more like a sales pitch, and comparing the two was difficult. In 2014, however, the scene changed notably.

In a recent report by Matt Southern of SearchEngineJournal, the announcement was made that Squidoo is moving its content over to HubPages. Why the move? HubPages has successfully acquired Seth Godin’s content platform.

Both platforms support content publishing, and they’ve both been labeled as “Web 2.0” sites. The unfortunate truth is that such sites are prone to abuse. As a result, both platforms have had unpleasant run-ins with Google’s Panda in the past.

Southern reports that, according to Godin, the acquisition will lead to “a stronger, more efficient, [and] more generous way to share great stuff online.” Over the coming weeks, traffic to Squidoo will be redirected to relevant HubPages, and transferring content between the platforms is said to be easy and primarily automatic. Squidoo pages are projected to no longer be accessible by early this month.

The Controversial Acquisition

HubPages’ acquisition and subsequent transfer of content has sparked some controversy. Barry Schwartz of Search Engine Land poses the question of whether it’s simply a transfer of content from one Google Panda victim to another. After all, both platforms were hit hard by the Panda algorithm back in 2011. And although Squidoo is reportedly moving only “the best” of its content, the question of whether HubPages is worth using is still relevant.

The Facts about HubPages

Just last year a SlideShare review of HubPages was posted. The review tackled two of the most relevant questions:

1. Does the platform provide the opportunity to earn an income online?
2. Should you use HubPages, Squidoo, or build your own website?

HubPages is built around the idea that content is part of marketing, which at face value is a solid plan. After all, content is the backbone of SEO these days. However, how you craft and implement that content is just as important as the who, what, when, where, and why of your copy. According to the SlideShare, the five steps to successfully leveraging the HubPages platform include:

1. Starting with a plan which, basically, is your niche.
2. Writing high-quality articles of approximately 1,000 words each.
3. Inserting some pictures and videos.
4. Picking the right keywords all of the time.
5. Publishing hubs in the same niche and inter-linking them.

Keep in mind that this platform is NOT for ClickBank affiliate marketing, competitive markets, copied (let’s just call it what it is: plagiarized) content, or spun articles.

A quick review of HubPages’ official website can give you a fairly good picture of the advantages and disadvantages of leveraging this content platform. Here’s what our trained eye noted:

• Advantage 1: It’s free. People are attracted to this platform because it’s free. You can publish your content and engage in a bit of advertising (so long as it follows the rules) at no out of pocket cost. Moreover, you don’t have to spend money to make money (apparently).

• Advantage 2: It presents an opportunity to earn. Platforms like HubPages (and previously Squidoo) advertise the opportunity to use their site as a chance or opportunity to earn money. It doesn’t matter what your purpose is, it just matters that you join. What they don’t tell you is that you’ll have to work very hard to make money via the “opportunity” they’re handing you.

• Advantage 3: It’s easy to manage. Setting up and managing a hub is easy. In fact, it’s so easy that they say anybody can do it. They advertise themselves as easily accessible by everyone, and no skills or understanding of anything is needed.

• Advantage 4: It’s easy to register. The registration process isn’t complex, and it will not take a lot of your time. You’ll be up and running in no time flat.

• Disadvantage 1: Earnings are percent based. One of the most talked about cons of this platform is the percent based earning system, which (at last report) is set at a 60/40 split. You will only receive 60 percent of the amount you acquire via online moneymaking methods. It sounds a lot like affiliate marketing, doesn’t it? Let’s put this disadvantage into perspective. Say you had expected earnings of $1,500. After the split, you would receive $900 and lose $600 to HubPages. That’s a sizeable chunk of change lost to a “free” opportunity to make money, and that 40 percent chunk is going to come out every single time.

• Disadvantage 2: There’s no assurance. The primary use of HubPages is to publish written articles of about 1,000 words each. With that in mind, how long does it take you to write a 1,000-word piece? If you can’t write well and decide to hire a writer, you’re looking at a new expense to subtract from the $900 profit you’re set to earn. A quality writer, one that will help and not harm your reputation, isn’t going to come cheap. And to top off the conundrum you’re staring down the barrel of, HubPages doesn’t offer any type of assurance. What does that mean? It means there is absolutely no guarantee that the site will be up for very long. Imagine investing the time in writing your content or PAYING someone to write it for you, and suddenly, out of nowhere, it’s all gone! It’s a horrifying thought, isn’t it?

• Disadvantage 3: Advertising opportunities are limited. Your ability to advertise is limited, and you are bound by the rules set forth by the platform. There’s no true ability to think outside of the box and engage in tailored advertising campaigns. In short, it’s not a platform that lends itself to growth.

• Disadvantage 4: It takes time. Managing your page will take a solid time investment. Set-up is the easy part. Creating and publishing those articles and engaging in the limited advertising opportunities will require a substantial commitment of time—more so than usual if only to recoup the 40 percent loss in profit by increasing your overall earnings. Then again, can you really recoup when you’re putting in double or triple the effort and still bound by that 60/40 percentage split? It just doesn’t make good business sense, at least not for the serious minded businessperson.
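The 60/40 split described in Disadvantage 1 is easy to sanity-check in a few lines of Python (integer dollars keep the arithmetic exact):

```python
def split_60_40(expected):
    """Return (your_share, platform_share) under a 60/40 revenue split."""
    yours = expected * 60 // 100   # you keep 60 percent
    return yours, expected - yours  # the platform keeps the rest

print(split_60_40(1500))  # (900, 600): $600 of a $1,500 expectation goes to the platform
```

And remember that, unlike a one-time fee, that 40 percent comes off the top of every payout.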

The majority of the information about this platform’s unsavory run-ins with the Google Panda predates the 4.0 update. Interestingly, we can’t find any credible information reflecting any positive run-ins since the update. We can only speculate that since the platform is a figurative breeding ground for thin content, it likely hasn’t improved dramatically. However, the update is still relatively recent, and improvements may not be seen for a few months.

Our Conclusion: You could consider HubPages to be a content publishing platform with the perks of an online community, but the overall gist reads an awful lot like affiliate marketing jargon. If you’re reading this, we’re betting you’re interested in establishing and/or growing a strong online presence to promote your business and generate leads. Your ultimate goal is to create conversion. Instead of investing your precious time into a platform that’s had unpleasant run-ins with Google, why not invest more wisely? And this leads us to our hard-hitting recommendation:

Don’t Contribute to the Crappy Content Flood

Thanks to the “dollar days” of online content, we’re smack dab in the middle of a quality content drought. Your audience, aka that group of people made up of prospective buyers, is dying of thirst for high-quality copy. That’s why Google has pushed so hard to ensure that websites comprised of useful, highly relevant, and top-notch quality content are the first results search users see.

Unfortunately, the track record shows that platforms like Squidoo and HubPages have contributed to the drought. Moz’s Q&A forum holds a strong example of exactly why you shouldn’t waste your time with platforms like these. Marko poses his situation and asks if he should choose Squidoo or HubPages:

Three of the four responders agree that creating content on crappy free-for-alls like Squidoo or HubPages just isn’t the way to go. Instead, they encourage Marko to concentrate on creating informative and engaging content, the kind that’s useful to the customer. One responder even urges him not to feel “afraid of [linking] to outside sources.”

What’s the point? A “quick fix” publishing platform cannot replace an expertly crafted, high-quality, Panda-pleasing website stocked with strong content. Platforms like HubPages are generally not frequented by serious individuals interested in building a credible reputation.

If your goal is to grow your brand or business and earn credibility and authority in your industry, platforms like these are NOT the way to go, if only for the reason of HubPages Advantage 3 (anybody with no skills or understanding of anything can—and will—join). And let’s not even talk about how the lack of assurance encourages crappy content that the creator won’t care about should it disappear into the cyber version of the Bermuda Triangle.

Where Is The Value?

The recent announcement of HubPages acquiring Squidoo has certainly raised some eyebrows throughout the top resource sites for SEO education. Both Matt Southern of SearchEngineJournal and Barry Schwartz of Search Engine Land agree that the news is most interesting. Who knows? The acquisition and resulting merge just might result in a content publishing platform that’s more Google-friendly. But until that’s proven, it’s probably best to keep a safe distance and watch what happens.

Are Squidoo and HubPages worth it? Honestly? No. They’re not worth your time or attention. Your valuable time could be much better invested in learning how to create your own Panda, Penguin, and Hummingbird friendly online presence. A presence that you will have the ultimate amount of control over; a presence that will only be limited by the boundaries of your creativity and passion.

Believe it or not, learning how to make it all happen isn’t rocket science. The fact that you’re willing to go the extra mile to learn will, by itself, give you an edge over every single HubPage and soon-to-be-disappearing Squidoo on the Internet, and THAT is well worth your time and every last bit of investment.

Choosing between MySQL vs PostgreSQL vs SQL Server

posted on June 5, 2021


The choice between SQL and non-SQL databases usually boils down to differences in structure. When we are comparing several SQL solutions, however, the criteria become a lot blurrier. Here we’ll consider these aspects more precisely and analyze the underlying functionality. We’ll be taking a look at the three most popular relational databases: MySQL vs PostgreSQL vs SQL Server.

To help you, we have collected advice from our database developers, gone back through the manuals, and even looked up official in-depth guides. We do have our personal preferences, but in this guide we will put them aside in favor of an objective comparison.



MySQL

MySQL happens to be one of the most popular databases, according to DB Engines Ranking. It’s a definite leader among SQL solutions, used by Google, LinkedIn, Amazon, Netflix, Twitter, and others. MySQL’s popularity has been growing because teams increasingly prefer open-source solutions over commercial ones.

Price: the database solution is developed by Oracle and has additional paid tools; the core functionality can be accessed for free.

Language: MySQL is written in C++; database management is done with Structured Query Language.


Read our comparison of MongoDB vs MySQL to make the right choice of a database solution.


PostgreSQL

PostgreSQL is a tried-and-proven relational database known for supporting many data types, intuitive storage of schemaless data, and rich functionality. Some developers go as far as to claim that it’s the most advanced open-source database on the market. We wouldn’t go that far, but it’s definitely a highly versatile solution.

Price: open-source

Language: C

SQL Server

Unlike PostgreSQL and MySQL, SQL Server is a commercial solution. It’s preferred by companies that deal with large traffic workloads on a regular basis. It’s also considered one of the systems most compatible with Windows services.

The SQL Server infrastructure includes a lot of additional tools, like reporting services, integration systems, and analytics. For companies that manage multiple teams, these tools make a big difference in day-to-day work.

Price: the database has a free edition for developers and small businesses, but it supports only 1 processor, 1 GB of maximum memory used by the database engine, and a 10 GB maximum database size. For a server, users need to pay $931.

Side-by-side Comparison of SQL Tools

In this comparison, we’ll take a look at the functionality of the three most popular SQL databases and examine their use cases and respective advantages and disadvantages. We’ll start by exploring the in-depth functionality.

Data Changes

Here we evaluate how easily data can be modified and the database defragmented. The key priorities are the systems’ flexibility, security, and usability.

Row updates

This criterion refers to the algorithms a database uses to update its contents, and to their speed and efficiency.

In MySQL’s case, the engine automatically copies updated data to rollback storage. If something goes wrong, developers can always go back to the previous version.

PostgreSQL: to update data, the engine inserts a new row version rather than overwriting in place, and every updated row gets a unique ID. This multiplies the number of stored rows and increases the size of the database, but in turn developers benefit from higher read concurrency.

SQL Server: the database has three engines responsible for row updates. The ROW Store handles the information on all previous row updates, IDs, and modified content. The in-memory engine allows analyzing the quality of an updated database with a garbage collector. The column store lets you keep updates in columns, as in column-oriented databases.


Among these three, SQL Server offers perhaps the most flexibility and efficiency, because it allows monitoring updated rows and columns, collecting errors, and automating the process. The difference between SQL Server and both MySQL and PostgreSQL lies mainly in customization: SQL Server offers a lot more options than the others.
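The "go back to the previous version" behavior described for MySQL is easiest to see with a transaction demo. The sketch below uses Python's built-in sqlite3 module as a neutral stand-in, since every transactional SQL engine behaves the same way here:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")
conn.execute("INSERT INTO accounts VALUES (1, 100.0)")
conn.commit()

# Update inside a transaction, then roll back: the engine restores
# the pre-update row from its rollback storage.
conn.execute("UPDATE accounts SET balance = 40.0 WHERE id = 1")
conn.rollback()

print(conn.execute("SELECT balance FROM accounts WHERE id = 1").fetchone()[0])  # 100.0
```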


Defragmentation

When developers update different parts of an SQL database, the changes occur at different points of the system and can become hard to read, track, and manage. Maintenance should therefore include defragmentation: the process of unifying the updated database by assigning indexes, revisiting the structure, and creating new pages. Defragmentation frees up disk space that isn’t being used properly, so the database can run faster.

MySQL offers several approaches to defragmentation – during backup, index creation, and with an OPTIMIZE Table command. Without going into much detail, we’ll just say that having that many options for table maintenance is convenient for developers, and it surely saves a lot of time.

PostgreSQL allows scanning entire tables of a data layer to find empty rows and delete the unnecessary elements. By doing so, the system frees up disk space. However, the method requires a lot of CPU and can affect the application’s performance.


SQL Server offers an efficient garbage collector that doesn’t create more than 15-20% of overhead. Technically, developers can even run garbage collector on a continuous basis, because it’s that efficient.

Overall, MySQL and SQL Server offer more defragmentation methods than PostgreSQL does, consume less CPU, and provide more flexible settings.
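To see what defragmentation buys you, here is a small demonstration with Python's sqlite3, where `VACUUM` plays the role of a command like MySQL's OPTIMIZE TABLE: deleted rows leave the file large until the database is rebuilt.

```python
import os
import sqlite3
import tempfile

path = os.path.join(tempfile.mkdtemp(), "frag.db")
conn = sqlite3.connect(path)
conn.execute("CREATE TABLE logs (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT INTO logs (payload) VALUES (?)",
                 [("x" * 1000,) for _ in range(5000)])
conn.commit()
before = os.path.getsize(path)

conn.execute("DELETE FROM logs")   # the pages stay allocated to the file
conn.commit()
conn.execute("VACUUM")             # rebuild the file, returning free pages to the OS
after = os.path.getsize(path)

print(after < before)  # True: the rebuilt file is much smaller
```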

Data Queries

Here, we take a look at how the systems cache and process user requests, what approaches they take in storing data, and how developers can manage it.

Buffer Pool

Some systems call it a buffer pool, others a cache; regardless of terminology, our goal here is to summarize the algorithms the systems use to process user queries and maintain connections.

MySQL offers a scalable buffer pool: developers can set the size of the cache according to the workload. If the goal is to save CPU and storage space, developers can put strict limits on their buffer pool. Moreover, MySQL allows dividing the cache into segments to store different data types and maximize isolation.

PostgreSQL isolates work even further than MySQL by treating each connection as a separate OS process with its own memory. On the one hand, management and monitoring become a lot easier; on the other, scaling multiple databases takes a lot of time and computing resources.

SQL Server also uses a buffer pool, and just like in MySQL, it can be limited or increased according to processing needs. All the work is done in a single pool, without the multiple per-process areas seen in PostgreSQL.

If your priority is to save computing resources and storage, choose the flexible solutions: the choice will be between MySQL and SQL Server. However, if you prefer clear organization and long-term order, PostgreSQL, with its isolated approach, might be a better fit.
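Sizing a cache to the workload looks roughly the same everywhere. As an illustration, sqlite3's page cache can be capped the way a server's buffer pool can (in SQLite's PRAGMA, a negative value means KiB); this is a stand-in sketch, not MySQL syntax:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Cap the page cache at roughly 16 MB; negative values are interpreted as KiB.
conn.execute("PRAGMA cache_size = -16000")
size = conn.execute("PRAGMA cache_size").fetchone()[0]
print(size)  # -16000
```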

Temporary Tables

Temporary tables allow storing intermediate results from complex procedures and branched business logic. If you need some information only to power the next process, it doesn’t make sense to store it in a regular table. Temporary tables improve database performance and organization by separating intermediary data from the essential information.


MySQL offers limited functionality for temporary tables. Developers cannot set variables or create global templates. The software even limits the number of times a temporary table can be referred to within a single query: not more than once.

PostgreSQL offers a lot more functionality when it comes to temporary content. You can divide temporary tables into local and global and configure them with flexible variables.

SQL Server also offers rich functionality for temporary table management. You can create local and global temporary tables, as well as oversee and create variables.

Temporary tables are essential for applications with complicated business logic. If your software runs a lot of complex processes, you will need to store multiple intermediary results. Having rich customization functionality will often be necessary throughout the development process.
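The intermediate-results pattern looks like this in practice. The sketch uses Python's sqlite3 as a neutral stand-in; the point is that a TEMP table is private to the connection that created it, while regular tables are shared:

```python
import os
import sqlite3
import tempfile

path = os.path.join(tempfile.mkdtemp(), "demo.db")
writer = sqlite3.connect(path)
reader = sqlite3.connect(path)

writer.execute("CREATE TABLE totals (region TEXT, amount REAL)")
# TEMP table: holds intermediate rows, visible only to `writer`.
writer.execute("CREATE TEMP TABLE staging (region TEXT, amount REAL)")
writer.executemany("INSERT INTO staging VALUES (?, ?)",
                   [("west", 10.0), ("west", 5.0)])
writer.execute("INSERT INTO totals "
               "SELECT region, SUM(amount) FROM staging GROUP BY region")
writer.commit()

print(reader.execute("SELECT * FROM totals").fetchall())  # [('west', 15.0)]
try:
    reader.execute("SELECT * FROM staging")  # not visible on this connection
except sqlite3.OperationalError as err:
    print(err)  # no such table: staging
```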


Indexes

The way a database handles indexes is essential, because indexes are used to locate data without searching every row. Indexes can refer to multiple rows and columns: you can assign the same index to files located in different places in the database and collect all these pieces with a single search.


In this comparison, we evaluated the way indexes are created in every solution, the support of multiple-index searches, and multi-column indexes, as well as partial ones.

MySQL organizes indexes in tables and clusters. Developers can automatically locate and update indexes in their databases. The search isn't highly flexible – you can't search for multiple indexes in a single query. MySQL supports multi-column indexes with up to 16 columns.

PostgreSQL also supports index-based table organization, but early versions don't include automated index updates (these appeared only with version 11). The solution also allows looking up many indexes in a single search, which means you can retrieve a lot of information at once. The multi-column settings are also more flexible than in MySQL – developers can include up to 32 columns.

SQL Server offers rich automated functionality for index management. Indexes can be organized in clusters that maintain the correct row order without manual involvement. The solution also supports multiple-index searches and partial (filtered) indexes.

Flexible index settings allow looking up information faster and organizing large amounts of data simultaneously.
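The multi-column and partial indexes discussed above can be sketched in runnable form with Python's built-in sqlite3 module (PostgreSQL and SQL Server support both; SQL Server calls partial indexes "filtered indexes"):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (last_name TEXT, first_name TEXT, active INTEGER)")

# Multi-column index: one structure serves lookups that filter on both columns.
conn.execute("CREATE INDEX idx_name ON users (last_name, first_name)")

# Partial index: covers only the rows matching the WHERE clause, so it
# stays small when most rows are inactive.
conn.execute("CREATE INDEX idx_active_names ON users (last_name) WHERE active = 1")

plan = conn.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT * FROM users WHERE last_name = 'Lee' AND first_name = 'Ana'"
).fetchall()
print(plan)  # the plan reports a SEARCH using idx_name rather than a full scan
```

Asking the planner to explain the query confirms that a single multi-column index answers a two-column filter without touching the table itself.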

Memory-Optimized Tables

Memory-optimized tables are mainly known as a SQL Server concept, but they also exist in other database management solutions. Such a table is stored in active memory, with only a simplified copy on disk. To increase transaction speed, the application can access data directly in memory, without blocking concurrent transactions. For processes that happen regularly and usually take a lot of time, a memory-optimized table can be a way to improve database performance.


MySQL supports memory-stored tables, but they can't participate in transactions, and their security is highly vulnerable. Such tables are used only for reading purposes and can simplify only primitive operations. For now, MySQL doesn't come close to making the most out of memory-optimized tables.

PostgreSQL doesn’t support in-memory database creation.

SQL Server uses an optimistic strategy to handle memory-optimized tables, which means they can participate in transactions along with ordinary tables. Memory-based transactions are faster than regular ones, and this allows a drastic increase in application speed.

As expected, memory-optimized tables are best set up in SQL Server – it's basically their native approach. It's not an essential database feature, but still a good way to improve performance.
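To get a feel for the concept, SQLite (via Python's sqlite3 module) can hold an entire database in RAM; the sketch below shows the key property SQL Server's memory-optimized tables have and MySQL's memory tables lack, namely participating in ordinary transactions:

```python
import sqlite3

# An entire database held in RAM: reads and writes skip the disk, which is
# the performance idea behind memory-optimized tables.
mem = sqlite3.connect(":memory:")
mem.execute("CREATE TABLE counters (name TEXT PRIMARY KEY, value INTEGER)")

# Unlike MySQL's memory tables, this one can take part in an ordinary
# transaction: both statements commit together or not at all.
with mem:
    mem.execute("INSERT INTO counters VALUES ('hits', 0)")
    mem.execute("UPDATE counters SET value = value + 1 WHERE name = 'hits'")

result = mem.execute("SELECT value FROM counters").fetchone()[0]
print(result)  # 1
```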

JSON Support

The use of JSON documents allows developers to store non-numeric data and achieve faster performance. Stored JSON documents don't have to be re-parsed on every read, which contributes to much higher processing speed. They are easily readable and accessible, which is why JSON support simplifies maintenance. JSON is mostly associated with non-relational databases, but lately, SQL solutions have supported the format as well.

MySQL supports JSON documents but doesn't allow indexing them directly. Overall, the functionality for JSON in MySQL is quite limited, and developers often prefer using classical strings. Similarly to non-relational databases, MySQL also allows working with geospatial data, although handling it isn't quite as intuitive.

PostgreSQL supports JSON documents, as well as their indexing and partial updates. The database supports even more non-standard data than MySQL: users can create user-defined types, geospatial data, multi-dimensional arrays, and a lot more.

SQL Server also provides full support of JSON documents, their updates, functionality, and maintenance. It has a lot of additional features for GPS data, user-defined types, hierarchical information, etc.

Overall, all three solutions are pretty universal and offer a lot of functionality for non-standard data types. MySQL, however, imposes several limitations on JSON documents, but other than that, it's highly compatible with advanced data.
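Here is a runnable sketch of querying inside a stored JSON document, using SQLite's built-in JSON functions via Python's sqlite3 module (the concept carries over: PostgreSQL uses the `->>` operator, SQL Server the `JSON_VALUE` function):

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (payload TEXT)")
doc = {"user": "ana", "location": {"lat": 49.28, "lon": -123.12}}
conn.execute("INSERT INTO events VALUES (?)", (json.dumps(doc),))

# json_extract pulls one field out of the stored document without
# unpacking the whole thing in application code.
user = conn.execute(
    "SELECT json_extract(payload, '$.user') FROM events"
).fetchone()[0]
print(user)  # ana
```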

Replication and Sharding

When an application grows, a single server can no longer accommodate the entire workload. Navigating a single storage location becomes complicated, so developers prefer to migrate to several locations or, at least, create partitions. Partitioning is the process of creating many compartments for data within a single database.



Scaling is easier in NoSQL databases because they support horizontal rather than vertical scaling – increasing the number of locations rather than the size of a single one. Still, it's possible to distribute data among different compartments even in SQL solutions, even if it's slightly less efficient.

MySQL allows partitioning databases with hashing functions in order to distribute data among several nodes. Developers can generate a specific partition key that will define the data location. Hashing permits avoiding bottlenecks and simplifying maintenance.

PostgreSQL allows making LIST and RANGE partitions, where the index of a partition is created manually. Developers need to identify the child and parent tables before assigning a partition to them.

SQL Server also provides access to RANGE partitioning, where the partition is assigned to all values that fall into a particular range. If the data lies within the threshold, it will be moved to the partition.
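The hash-partitioning idea MySQL uses can be sketched in a few lines of Python; `zlib.crc32` stands in here for the engine's internal hash function, and the partition key fully determines where a row lives:

```python
import zlib

# Hash partitioning: the partition key is hashed, and the result modulo
# the node count picks one of N compartments. Because the mapping is
# deterministic, every lookup for the same key goes to the same node,
# avoiding bottlenecks on any single location.
NODES = 4

def node_for(partition_key: str) -> int:
    return zlib.crc32(partition_key.encode()) % NODES

rows = ["user:1001", "user:1002", "user:1003", "user:1004", "user:1005"]
placement = {key: node_for(key) for key in rows}
print(placement)  # same key always maps to the same node
```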


The database ecosystem is important because it defines the frequency of updates, the availability of learning resources, the demand on the market, and the tool's long-term viability.

MySQL Ecosystem


MySQL is part of the Oracle ecosystem. It's the biggest SQL database on the market, with a large open-source community. Developers can either purchase commercial add-ons developed by the Oracle team or use free installations. You will easily find tools for database management, monitoring, optimization, and learning. The database itself is easy to install – pretty much all you have to do is download the installer.

MySQL has been a reliable database solution for 25 years, and the statistics don't point to any signs of decline. It looks like MySQL will keep holding a leading position not only among SQL tools but among databases in general.

PostgreSQL Ecosystem


The PostgreSQL community offers a lot of tools for software scaling and optimization. You can find add-ons for your industry – take a look at the full list on the official page. The integrations allow developers to perform clustering, integrate AI, collaborate, track issues, improve object mapping, and cover many other essential needs.

Some developers point out that PostgreSQL's installation process is slightly complicated – you can take a look at its official tutorial. Unlike MySQL, which can run right away, PostgreSQL requires additional setup steps.

SQL Server Ecosystem


SQL Server is highly compatible with Windows and the whole range of Microsoft operating systems and tools. If you are working with Windows, SQL Server is definitely the best option on the market. Users of the database get access to many additional instruments that cover server monitoring (Navicat Monitor), data analysis and parsing (SQL Parser), and security management (DBHawk).

The SQL Server ecosystem is oriented towards large infrastructures. It's more expensive than its open-source competitors, but at the end of the day, users get access to a frequently updated official ecosystem and active customer support.

What is the difference between MySQL and SQL Server? MySQL is an open-source database, whereas SQL Server is a commercial one. MySQL is more popular, but SQL Server comes close.


For a start, we analyzed the DB-Engines ratings of each compared engine. Among the three, the leader is MySQL, ranked the second most popular database overall and the second most popular relational solution. SQL Server takes third place, while PostgreSQL is ranked fourth.

The statistics from Statista show the same tendency. MySQL is ranked second, leaving the leading position to Oracle, the most popular DBMS today. SQL Server follows with a slim margin, whereas PostgreSQL, which comes right after, is a lot less recognized.


MySQL, therefore, is the most in-demand database on the market, which means finding competent teams, learning resources, reusable libraries, and ready-made add-ons will be easy. So, if you are choosing between SQL Server and MySQL in terms of market trends, the latter is a better choice.

Companies using MySQL

  • Google
  • Udemy
  • Netflix
  • Airbnb
  • Amazon
  • Pinterest

MySQL is widely used by big corporations and governmental organizations. Over the last 25 years, the solution has built a reputation as a reliable database management system, and as time shows, it's indeed capable of supporting long-running projects.

Companies that use PostgreSQL

  • Apple
  • Skype
  • Cisco
  • Etsy

PostgreSQL is known for its intuitive functionality and versatile security settings. This is why its main use cases are governmental platforms, messenger applications, video chats, and e-commerce platforms.

Companies using SQL Server

  • JPMorgan Chase
  • Bank of America
  • UPS
  • Houston Methodist

SQL Server is a go-to choice for large enterprises that have vast business logic and handle multiple applications simultaneously. Teams that prioritize efficiency and reliability over scalability and costs typically choose this database. It’s a common option for “traditional” industries – finances, security, manufacturing, and others.



The choice between the three most popular databases ultimately boils down to the comparison of the functionality, use cases, and ecosystems. Companies that prioritize flexibility, cost-efficiency, and innovation usually choose open-source solutions. They can be integrated with multiple free add-ons, have active user communities, and are continuously updated.

For corporations that prefer traditional commercial solutions, software like SQL Server, backed by a big corporation and compatible with an extensive infrastructure, is a better bet. They get access to constant technical support, personalized assistance, and professional management tools.

If you are considering a database for your project, getting a team of experts who will help you define the criteria and narrow down the options is probably the best idea. You can always get in touch with our database developers – we will create a tech stack for your product and share our development experience.

Publishing Content Online Tutorial

posted on June 5, 2021


You now know that the key to an unforgettable and successful marketing campaign starts with brilliant content. Yet 74% of marketers focus solely on the content itself, not the actual publication. If you're adopting an "if I write it, they will come" philosophy, you will undoubtedly experience failure. Creating relevant, educational, and unique content for your audience is only the first step. The second step – and in many ways the most crucial – is getting published.

In early 2014, Google's Matt Cutts essentially 'spelled the end of guest blogging.' That was bad news to anyone not on the content marketing train, but most savvy marketers knew by then that creating their own content and publishing it throughout the web was the secret to customer acquisition. This was just another nail in the coffin for the old ways of generating leads.

Content alone is not the only component to successful content marketing. There is a trifecta that must be mastered by companies if they want to play big. The three prongs of content marketing are:

1) Content – unique, compelling, and timely.

2) Audience – you absolutely must have a clear focus on who you are marketing to, based on research that shows who your quintessential customer is.

3) Timing – content must be relevant, never outdated, and important to your demographic as it is published.

Step 1: Finding Your Voice

Before you publish a single blog or video, it’s essential that your company spend time establishing a clear voice and identity. If you’re all over the map with your marketing, it will show. Consumers will become confused by your messaging, and loyalty will be incredibly hard to come by. Furthermore, it’s bloody hard to draft content when you’re unclear about what voice to use.

As you clarify your brand’s identity, find a voice that is authoritative but not off-putting. Your content creators must have the ability to write with conviction and confidence, but in a way that relates to your readers. This is by no means an easy balance, and it’s the reason why hiring cheap freelancers that don’t know your business can cause more harm than good.

Keep in mind, too, that the best way to engender confidence is to speak the language of your customers. Use the art of surveys and polls to learn more about your demographic, and log metrics religiously so when you draft your content, you know exactly who you’re speaking to. Case studies are also invaluable assets to your marketing team; by understanding your previous successes and failures, you continue to narrow in on the exact sweet spot your company has carved out. That’s precisely who you should create content for.

What does this have to do with publishing your content? Absolutely everything. If you don’t have a clue about who your ideal customer is, you don’t know what publications to seek out. And if your content isn’t created with an air of authority and expertise, your target publications will no doubt pass you by. Step one is always about establishing your thought leadership prowess; otherwise, you are likely to get lacklustre (at best) results.

Avoid Duplicate Content

If you want to be published in any reputable publication, you need to have something unique to say. In this age of content clutter, that is no small feat. Need some inspiration? Here are some tips for creating intriguing and original content:

1) Take a New Perspective

Finding a brand new topic in your industry and area of expertise may seem impossible. But what about taking a different position on a commonly discussed theme? If everyone is talking about the features of a given piece of software, as an example, talk about the target user. Or the inadequacies. Or perhaps the competitive landscape. You don’t have to reinvent the wheel, but you do have to find a different way to talk about what the wheel does.

2) Expand Your Thought Horizons

Most content these days has a somewhat narrow focus, discussing how things are impacting the present moment. Don’t be afraid to expand into the historical or future implications of any given product or movement. If you’re a business that’s been around a while, let your content tell the story of your evolution, the snafus you encountered, the lessons you learned, and everything in between. Examine details that haven’t yet been discussed, or expound on future plans; either on a minute or a broad scale. The point here is to encourage you to think beyond the obvious, and to create thought leadership around the topics you know best.

3) Interview the Experts

Find people that know your world well and feature interviews and quotes from these top-tier contributors. Interviews are still very popular in almost all industries, and they’re a great way to generate compelling content and inspiration for future pieces as well. Plus, you’ll gain loyalty from those you highlight, as well as access to their audiences. Featuring influencers is always a win-win.

Once you master the process of speaking to your audience in a commanding way, and you know exactly how to create pieces that fill in the information gap, you are absolutely gold to publishers. Next, you’ll want to spend significant time finding the online magazines, blogs, and sites that cater to your audience. Don’t just rely on top tier destinations like Forbes; branch out and find niche sites that are magnets for your demographic. To find these, comb through articles AND comments to find what sites are most engaging amongst those that feature content from your industry. Analyze social sharing metrics, and start relationship building with these publishers so your amazing content finds its ideal audience.

Getting published starts with knowing your audience, creating relevant content, and then creating bonds with the folks that need your creative genius. Publishers need great content, and content creators need successful publishers. If you focus on these two aspects of content marketing, you just might see results that exceed your wildest dreams.

Which SEO tools do you suggest using daily?

posted on June 4, 2021


Top 4 SEO Tools Every Marketer Should Use in 2021

Modern Search Engine Optimisation (SEO) practice dictates that a combination of tools be used to ensure agency clients perform strongly across search engines. In Australia, the market is dominated by Google, meaning the majority of efforts go into optimising for Google search. At Sharp Instincts, the opening questions we often get asked when meeting new clients are "How important are rankings in driving business performance?" and "Do you really need SEO tools to measure performance?" The answer to the first question is that rankings are exceptionally important to overall business performance. As a result, the need to use a variety of tools to measure performance is critical. Whether you are analysing traffic, searching for the optimal keywords, curating excellent content, creating client reports, tracking keyword rankings or mapping your website traffic funnels, there are a variety of tools that make SEO work effective, accurate and fast. In this blog, we provide an overview of the 4 best SEO tools we recommend for your daily SEO practice:

1. Moz

2. Ahrefs

3. SEMrush

4. Ubersuggest

What is the future of artificial intelligence?

posted on June 2, 2021


I worked for a decade at NVIDIA as a CUDA Devtech Engineer and Solution Architect, researching deep learning techniques, presenting solutions to customers to solve their problems, and helping implement those solutions. For the past 3 years, I have been working on what comes next after DNNs and Deep Learning. I will cover both, showing why it is very difficult to scale DNNs to AGI, and what a better approach would be.

Here is a quick graph of Deep Learning and AGI, and how I am estimating they might grow:

What we usually think of as Artificial Intelligence (AI) today, when we see human-like robots and holograms in our fiction, talking and acting like real people and having human-level or even superhuman intelligence and capabilities, is actually called Artificial General Intelligence (AGI), and it does NOT exist anywhere on earth yet. What we actually have for AI today is much simpler and much more narrow Deep Learning (DL) that can only do some very specific tasks better than people. It has fundamental limitations that will not allow it to become AGI, so if that is our goal, we need to innovate and come up with better networks and better methods for shaping them into an artificial intelligence.

Let's look at:

  1. Where Deep Learning and Reinforcement Learning are today
  2. What their limitations are - what can they do and not do?
  3. The neuroscience of human intelligence
  4. A possible architecture to achieve artificial general intelligence

Today’s Deep Learning and Reinforcement Learning

Let me write down some extremely simplistic definitions of what we do have today, and then go on to explain what they are in more detail, where they fall short, and some steps towards creating more fully capable 'AI' with new architectures.

Machine Learning - Fitting functions to data, and using the functions to group it or predict things about future data. (Sorry, greatly oversimplified)

Deep Learning - Fitting functions to data as above, where those functions are layers of nodes that are connected (densely or otherwise) to the nodes before and after them, and the parameters being fitted are the weights of those connections.

Deep Learning is what usually gets called AI today, but it is really just very elaborate pattern recognition and statistical modelling. The most common techniques/algorithms are Convolutional Neural Networks (CNNs), Recurrent Neural Networks (RNNs), and Reinforcement Learning (RL).

Convolutional Neural Networks (CNNs) have a hierarchical structure (which is usually 2D for images), where an image is sampled by (trained) convolution filters into a lower-resolution map that represents the value of the convolution operation at each point. In images it goes from high-res pixels, to fine features (edges, circles, …), to coarse features (noses, eyes, lips, … on faces), and then to the fully connected layers that can identify what is in the image. The cool part of CNNs is that the convolutional filters are randomly initialized, so when you train the network, you are actually training the convolution filters. For decades, computer vision researchers had hand-crafted filters like this, but could never get results as accurate as CNNs can. Additionally, the output of a CNN can be a 2D map instead of a single value, giving us an image segmentation. CNNs can also be used on many other types of 1D, 2D and even 3D data.
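The core operation is simple enough to sketch in a few lines. Here is a minimal 1D convolution in plain Python: the filter slides over the signal, and each output value is a dot product. In a CNN, these filter weights are what training adjusts, instead of being hand-crafted:

```python
def conv1d(signal, kernel):
    # Slide the kernel over the signal; each output value is the dot
    # product of the kernel with the window of the signal under it.
    k = len(kernel)
    return [
        sum(signal[i + j] * kernel[j] for j in range(k))
        for i in range(len(signal) - k + 1)
    ]

# An edge-detector kernel: it responds strongly where the signal jumps,
# the 1D analogue of an edge filter in an image.
result = conv1d([0, 0, 0, 1, 1, 1], [-1, 1])
print(result)  # [0, 0, 1, 0, 0]
```

The single spike in the output marks exactly where the input steps from 0 to 1, which is the kind of low-level feature the first CNN layer learns to detect.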

Recurrent Neural Networks (RNNs) work well for sequential or time series data. Basically, each 'neural' node in an RNN is a kind of memory gate, often an LSTM or Long Short-Term Memory cell. When these are linked up in layers of a neural net, these cells/nodes also have recurrent connections looping back into themselves, and so tend to hold onto information that passes through them, retaining a memory and allowing processing not only of current information but of past information in the network as well. As such, RNNs are good for time-sequential operations like language processing or translation, as well as signal processing, Text To Speech, Speech To Text, and so on.

Reinforcement Learning is a third main ML method, where you train a learning agent to solve a complex problem by taking the best actions given a state, with the probability of taking each action in each state defined by a policy. An example is running a maze, where the position of each cell is the 'state', the 4 possible directions to move are the actions, and the probability of moving in each direction, at each cell (state), forms the policy.

By repeatedly running through the states and possible actions, rewarding the sequences of actions that gave a good result (by increasing the probabilities of those actions in the policy), and penalizing the actions that gave a negative result (by decreasing their probabilities in the policy), you in time arrive at an optimal policy, which has the highest probability of a successful outcome. Usually, while training, you discount the penalties/rewards for actions further back in time.

In our maze example, this means allowing an agent to go through the maze, choosing a direction to move from each cell by using the probabilities in the policy, and when it reaches a dead-end, penalizing the series of choices that got it there by reducing the probability of moving that direction from each cell again. If the agent finds the exit, we go back and reward the choices that got it there by increasing probabilities of moving that direction from each cell. In time the agent learns the fastest way through the maze to the exit, or the optimal policy. Variations of Reinforcement learning are at the core of the AlphaGo AI and the Atari Video Game playing AI.
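The maze example above can be sketched as a toy tabular learner in plain Python. This is a deliberately simplified corridor rather than a full maze, and the value table stands in for the policy: raising a value makes the greedy agent more likely to repeat that action in that state.

```python
import random

# States 0..4 form a corridor; the two actions are move left (-1) and
# move right (+1); state 4 is the exit.
random.seed(0)
N, EXIT = 5, 4
q = {(s, a): 0.0 for s in range(N) for a in (-1, 1)}

def best(s):
    return max((-1, 1), key=lambda a: q[(s, a)])

for _ in range(500):                      # training episodes
    s = 0
    while s != EXIT:
        # Explore occasionally; otherwise take the currently best action.
        a = random.choice((-1, 1)) if random.random() < 0.2 else best(s)
        s2 = min(max(s + a, 0), N - 1)    # walls clamp the move
        reward = 1.0 if s2 == EXIT else 0.0
        # Reward flows backward through the next state's value, discounted
        # by 0.9 for choices further back in time.
        q[(s, a)] += 0.5 * (reward + 0.9 * max(q[(s2, -1)], q[(s2, 1)]) - q[(s, a)])
        s = s2

policy = {s: best(s) for s in range(N - 1)}
print(policy)  # after training, every interior state prefers +1 (move right)
```

After enough episodes, the discounted reward from the exit has propagated back to the start, so the learned policy moves right from every cell, the corridor analogue of the agent finding the fastest way through the maze.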

But all these methods just find a statistical fit – DNNs find a narrow fit of outputs to inputs that does not usually extrapolate outside the training data set. Reinforcement learning finds a pattern that works for the specific problem (as we all did with 1980s Atari games), but not beyond it. With today's ML and deep learning, the problem is that there is no true perception, memory, prediction, cognition, or complex planning involved. There is no actual intelligence in today's AI.

The Neuroscience of Human Intelligence

To go beyond where we are today with AI, to pass the threshold of human intelligence and create an artificial general intelligence, requires an AI to have the ability to see, hear, and experience its environment. It needs to be able to learn that environment, to organize its memory non-locally, and to store abstract concepts in a distributed architecture so it can model its environment and the people in it. It needs to be able to speak conversationally and interact verbally like a human, and to understand the experiences, events, and concepts behind the words and sentences of language so it can compose language at a human level. It needs to be able to solve all the problems a human can, using flexible memory recall, analogy, metaphor, imagination, intuition, logic, and deduction from sparse information. It needs to be capable of the tasks and jobs humans do, and to express the results in human language, in order to do those tasks and professions as well as or better than a human.

The human brain underwent a very complicated evolution, starting 1 billion years ago with the first multi-cellular animals with a couple of neurons, through the Cambrian explosion, where eyes, ears and other sensory systems, motor systems, and intelligence exploded in an arms race (along with armor, teeth, and claws). The evolution of brains then followed the needs of fish, reptiles, dinosaurs, and mammals, and finally went up the hominid lineage about 5-10 million years ago.

Much of the older part of the human brain evolved during the first billion years of violence and competition, not the last few thousand years of human civilization, so in many ways our brain is maladapted to modern life in the information age, and not very efficient at many of the tasks we use it for in advanced professions like law, medicine, finance, and administration. A synthetic brain focused on doing these tasks optimally can probably end up doing them much better, so we do not seek to re-create the biological human brain, but to imbue ours with the core functionality that makes the human brain so flexible, adaptable, and powerful, then augment that with CS database and computing capabilities to take it far beyond human.

Because deep learning DNNs are so limited in function and can only be trained to do narrow tasks with pre-formatted and labelled data, we need better neurons and neural networks with temporal-spatial processing and dynamic learning. The human brain is a very sophisticated bio-chemical-electrical computer with around 100 billion neurons and 100 trillion connections (and synapses) between them. I will describe two decades of neuroscience in the next two paragraphs, but the two '2-Minute Neuroscience' videos on YouTube about the biological neuron and the synapse will also help.

Each neuron takes in spikes of electrical charge from its dendrites and performs a very complicated integration in time and space, resulting in the charge accumulating in the neuron and (once exceeding an action potential) causing the neuron to fire spikes of electricity out along its axon, moving in time and space as that axon branches and re-amplifies the signal, carrying it to thousands of synapses, where it is absorbed by each synapse. This process causes neurotransmitters to be emitted into the synaptic cleft, where they are chemically integrated (with ambient neurochemistry contributing). These neurotransmitters migrate across the cleft to the post-synaptic side, where their accumulation in various receptors eventually cause the post-synaptic side to fire a spike down along the dendrite to the next neuron. When two connected neurons fire sequentially within a certain time, the synapse between them becomes more sensitive or potentiated, and then fires more easily. We call this Hebbian learning, which is constantly occurring as we move around and interact with our environment.
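The Hebbian rule described above ("neurons that fire together wire together") can be sketched as a toy update in plain Python; the learning rate and decay constants here are illustrative, not biological measurements:

```python
def hebbian_update(weight, pre_fired, post_fired, lr=0.1, decay=0.01):
    # When the pre- and post-synaptic neurons fire within the same step,
    # the synapse potentiates (its weight moves toward a ceiling of 1.0);
    # otherwise it decays slightly, modeling mild forgetting.
    if pre_fired and post_fired:
        return weight + lr * (1.0 - weight)
    return weight * (1.0 - decay)

# Repeated coincident firing strengthens the synapse over time.
w = 0.2
for pre, post in [(1, 1), (1, 1), (0, 1), (1, 1)]:
    w = hebbian_update(w, pre, post)

print(round(w, 3))  # noticeably stronger than the initial 0.2
```

This constant, local strengthening of frequently co-active connections is the same mechanism the text credits with learning as we move around and interact with our environment.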

The brain is organized into cortices for processing sensory inputs, motor control, language understanding, speaking, cognition, planning, and logic. Each of these cortices evolved to have networks with very sophisticated space and time signal processing, including feedback loops and bidirectional networks, so visual input is processed into abstractions or ‘thoughts’ by one directional network, and then those thoughts are processed back out to a recreation of the expected visual representation by another, complementary network in the opposite direction, and they are fed back into each other throughout. Miguel Nicolelis is one of the top neuroscientists to measure and study this bidirectionality of the sensory cortices.

As an example, picture a 'fire truck' with your eyes closed, and you will see the feedback network of your visual cortex at work, turning the 'thought' of a fire truck into an image of one. You could probably even draw it if you wanted. Try looking at clouds, and you will see shapes that your brain is feeding back to your vision as thoughts of what to look for and to see. Visualize shapes and objects in a dark room when you are sleepy, and you will be able to make them take form, with your eyes open.

These feedback loops not only allow us to selectively focus our senses, but also train our sensory cortices to encode the information from our senses into compact ‘thoughts’ or Engrams that are stored in the hippocampus short term memory. Each sensory cortex has the ability to decode them again and to provide a perceptual filter by comparing what we are seeing to what we expect to see, so our visual cortex can focus on what we are looking for and screen the rest out as we stated in the previous paragraph.

The frontal and pre-frontal cortex are thought to have tighter, more specialized feedback loops that can store state (short-term memory), operate on it, and perform logic and planning at the macroscale. All our cortices (and brain) work together and can learn associatively and store long-term memories by Hebbian learning, with the hippocampus being a central controller for memory, planning, and prediction.

Human long-term memory is less well understood. We do know that it is non-local: injuries to specific areas of the brain don't remove specific memories, not even a hemispherectomy, which removes half the brain. Rather, any given memory appears to be distributed through the brain, stored like a hologram or fractal, spread out over a wide area in thin slices. We know that global injury to the brain, like Alzheimer's, causes a progressive global loss of all memories, which degrade together, but no single structure in the brain seems to contribute more to this long-term memory loss than another.

However, specific injury to the hippocampus causes the inability to transfer memories from short-term to long-term memory. It also causes the inability to predict and plan, along with other cognitive deficits, showing that all these processes are related. This area is the specialty of the prominent memory neuroscientist Eleanor Maguire, who states that the purpose of memory in the brain is not to recall an accurate record of the past, but to predict the future and reconstruct the past from the scenes and events we experienced, using the same stored information and the same process in the brain that we use to look into the future, to predict what will happen, or to plan what to do. Therefore, the underlying storage of human memories must be structured as an abstracted representation, in such a way that memories can be reconstructed from it for the purpose at hand – be it reconstructing the past, predicting the future, planning, or imagining stories and narratives – all hallmarks of human intelligence.

Replicating all of the brain's capabilities seems daunting when seen through the tools of deep learning – image recognition, vision, speech, natural language understanding, written composition, solving mazes, playing games, planning, problem solving, creativity, imagination – because deep learning uses single-purpose components that cannot generalize. Each of the DNN/RNN tools is a one-off, a specialization for a specific task that cannot generalize, and there is no way we can specialize and combine them all to accomplish all these tasks.

But the human brain is simpler and more elegant, built from fewer, more powerful, general-purpose building blocks, the biological neurons, connected according to the instructions of a mere 8,000 genes. Through a billion years of evolution, nature has arrived at an elegant and easy-to-specify architecture for the brain and its neural network structures, one able to solve all the problems we met during that evolution. We will start by copying as much of the human brain's functionality as we can, then use evolution to solve the harder design problems.

So now we know more about the human brain, how the neurons and neural networks in it differ completely from the DNNs that deep learning uses, and how much more sophisticated our simulated neurons, networks, and cortices would have to be to even begin building something on par with, or superior to, the human brain.

Here is the video about neuroscience and AGI that I submitted to the NVIDIA GTC 2021 Conference

How can we build an Artificial General Intelligence?

To build an AGI, we need better neural networks, with more powerful processing in both space and time and the ability to form complex circuits, including feedback loops. We will pick spiking neural networks, whose signals travel between neurons, gated by synapses.
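As a concrete, heavily simplified illustration of such spiking neurons, here is a minimal leaky integrate-and-fire model in Python. The threshold, leak, and input values are illustrative assumptions for the sketch, not parameters from this design.

```python
# A minimal leaky integrate-and-fire (LIF) spiking neuron: membrane
# potential accumulates weighted input over time, leaks between steps,
# and emits a spike (then resets) when it crosses a threshold.

class LIFNeuron:
    def __init__(self, threshold=1.0, leak=0.9):
        self.threshold = threshold  # potential needed to fire
        self.leak = leak            # fraction of potential kept each step
        self.potential = 0.0

    def step(self, weighted_input):
        """Integrate one timestep of synapse-weighted input; return True on spike."""
        self.potential = self.potential * self.leak + weighted_input
        if self.potential >= self.threshold:
            self.potential = 0.0    # reset after firing
            return True
        return False

# Drive the neuron with a constant input: it accumulates, fires, resets,
# producing a regular spike train rather than a single static output.
neuron = LIFNeuron()
spikes = [neuron.step(0.4) for _ in range(10)]
```

Unlike a deep-learning unit, this neuron's output depends on the timing and history of its inputs, which is the extra processing "in time" referred to above.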

With these, we can build bidirectional neural network autoencoders that take sensory input data and encode it into compact engrams holding the unique input data, while the common data stays in the autoencoder itself. This lets us process all the sensory inputs, vision, speech, and many others, into consolidated, usable chunks of data called engrams, stored in short-term memory.
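To make the "common data stays in the codec, unique data goes in the engram" idea concrete, here is a toy Python sketch. A real system would use a learned bidirectional autoencoder; this mean-residual stand-in is purely illustrative, and the class name and vectors are invented for the example.

```python
# A toy "sieve" codec: the shared structure of the inputs is absorbed
# into the codec itself (the mean pattern), and each engram keeps only
# what is unique to one input (its residual).

class SieveCodec:
    def fit(self, inputs):
        n = len(inputs)
        # The "common data" retained inside the codec: the mean pattern.
        self.common = [sum(col) / n for col in zip(*inputs)]

    def encode(self, x):
        # Engram = the unique residual after removing the shared pattern.
        return [xi - ci for xi, ci in zip(x, self.common)]

    def decode(self, engram):
        # Reconstruction = shared pattern + unique residual.
        return [ei + ci for ei, ci in zip(engram, self.common)]

codec = SieveCodec()
sensory = [[1.0, 2.0, 3.0], [1.0, 2.0, 5.0], [1.0, 2.0, 4.0]]
codec.fit(sensory)
engrams = [codec.encode(x) for x in sensory]
```

Note how the first two coordinates, identical across all inputs, vanish from every engram: only the distinguishing information survives, which is what makes the engrams compact.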

Now, to store it to long-term memory, we process a set of input engrams into a multi-layered, hierarchical, fragmented long-term memory. First we sort the engrams into clusters along the most important axis of information, then autoencode those clusters further with bidirectional networks to create engrams that highlight the next most important information, and so on. At each layer, the bidirectional autoencoder acts like a sieve, straining out the data or features common to the cluster and leaving the unique identifying information in each engram, which can then be sorted on the next most important identifying information. Our AI essentially divides the world it perceives by distinguishing features, getting more specific at each level down, with the lowest-level engram containing the key for reconstructing the full engram from the features stored in the hierarchy. This leaves it with a differential, non-local, distributed Hierarchical Fragmented Memory (HFM) containing an abstracted model of the world, similar to how human memory is thought to work.
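The alternating cluster-then-sieve layering can be sketched in Python as follows, again using a mean-residual codec in place of a learned autoencoder. The one-dimensional sign-based clustering, the two layers, and the toy engrams are all assumptions made for the sake of a small runnable example.

```python
# Build one branch of a hierarchical fragmented memory: at each layer,
# group engrams by a distinguishing coordinate, strain out each group's
# shared pattern, and pass the residuals down as the next layer's engrams.

def cluster(engrams, axis):
    """Group engrams into bins by the sign of one coordinate (toy clustering)."""
    groups = {}
    for e in engrams:
        groups.setdefault(e[axis] >= 0, []).append(e)
    return list(groups.values())

def sieve(members):
    """Strain out the group's common pattern; return (common, residuals)."""
    n = len(members)
    common = [sum(col) / n for col in zip(*members)]
    residuals = [[xi - ci for xi, ci in zip(x, common)] for x in members]
    return common, residuals

def build_hierarchy(engrams, axes):
    """Alternate clustering and sieving, one information axis per layer."""
    layers = []
    level = [engrams]
    for axis in axes:
        next_level, layer_commons = [], []
        for group in level:
            for members in cluster(group, axis):
                common, residuals = sieve(members)
                layer_commons.append(common)
                next_level.append(residuals)
        layers.append(layer_commons)
        level = next_level
    return layers, level

engrams = [[2.0, 1.0], [2.0, -1.0], [-2.0, 1.0], [-2.0, -1.0]]
layers, leaves = build_hierarchy(engrams, axes=[0, 1])
```

After two layers, every leaf residual is zero: all of the information has been factored into the shared patterns stored at each level, leaving only the path through the hierarchy as the memory's unique key.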

An example of our encoding process is faces. We encode pictures of faces using the process above, then apply alternating layers of autoencoding and clustering to keep sorting those faces and encoding them by implicit features, which could be eye color, hair style, hair color, nose shape, and so on (determined implicitly by the layers of autoencoding, with bins for different classes of features allowed to overlap). The result is a facial recognition system that, just by looking at people, autoencodes each face and its features and can attach the name that was heard when the person was introduced. Later, when we meet a new person, the memory structure and autoencoders are already in place to encode them quickly and compactly.

It also encodes language (spoken and written) along with the input information, turning language into a skeleton embedded in the HFM engrams, used to reference and shape the data, while the HFM gives structure and meaning to the language.

When our AI wants to reconstruct a memory (or create a prediction), it works from the bottom up, using language or other keys to select the elements it wants to propagate upwards, re-creating scenes, events, and people, or creating imagined events and people from the fragments by controlling how it traverses upwards. This is the foundation the rest of our design is built on: once we can re-create past events and imagine new ones, we have the ability to predict the future and plan possible scenarios, performing cognition and problem solving.
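The bottom-up traversal can be sketched as summing a leaf fragment with the shared patterns stored along its path back up the hierarchy; choosing a different path yields an imagined blend rather than a literal recall. The paths and numbers below are illustrative assumptions.

```python
# Reconstruct a memory bottom-up: start from a leaf fragment (the unique
# residual) and add back each layer's stored common pattern on the way up.

def reconstruct(leaf_fragment, path_of_commons):
    """Sum the leaf's unique residual with the shared patterns along its path."""
    memory = list(leaf_fragment)
    for common in path_of_commons:
        memory = [m + c for m, c in zip(memory, common)]
    return memory

# Recalling the past: follow the fragment's own path up the hierarchy.
leaf = [0.0, 1.0]
own_path = [[2.0, 0.0], [0.5, 0.5]]
recalled = reconstruct(leaf, own_path)

# Imagining something new: traverse upward along a different path,
# blending this fragment with another branch's shared patterns.
other_path = [[-2.0, 0.0], [0.5, 0.5]]
imagined = reconstruct(leaf, other_path)
```

The same mechanism serves recall, prediction, and imagination; only the choice of upward path changes.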

We may be able to build a very simple brain that demonstrates these principles, but to scale, we need Charles Darwin: evolutionary genetic algorithms. Basically, we define every neuron, synapse, and neural network parameter, and how they are organized into layers, cortices, and brains, by a genome.

The human brain is specified by only about 8,000 genes, decoded by the growth process during fetal development. We will do the same: we cannot run genetic algorithms directly on 100 billion neurons, but we can run them far more efficiently on a few thousand genes, then expand the cross-bred genomes into 100-billion-neuron brains.
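Here is a small Python sketch of that idea: evolution operates on a compact genome, and a growth function expands it into a much larger "brain" only when fitness is evaluated. The genome layout, growth rule, and fitness function are all toy assumptions, not part of the actual design.

```python
import random

random.seed(0)  # deterministic toy run

def grow(genome):
    """Expand a compact genome into a full layer-size plan, the way growth
    expands a few thousand genes into billions of neurons."""
    depth, width = genome
    return [width * (2 ** i) for i in range(depth)]

def fitness(genome):
    # Toy objective: prefer deeper brains, penalize total neuron count.
    layers = grow(genome)
    return len(layers) * 10 - sum(layers) / 100

def crossover(a, b):
    return [random.choice(pair) for pair in zip(a, b)]

def mutate(genome):
    i = random.randrange(len(genome))
    genome = list(genome)
    genome[i] = max(1, genome[i] + random.choice([-1, 1]))
    return genome

# Evolve on the small genomes; only fitness evaluation touches the
# expanded brain, so selection and breeding stay cheap.
population = [[random.randint(1, 5), random.randint(1, 8)] for _ in range(20)]
for generation in range(30):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]  # elitism: keep the fittest half
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(10)]
    population = parents + children

best = max(population, key=fitness)
```

The crossover and mutation operate on two numbers per genome, yet each genome deterministically specifies an arbitrarily large layered structure, which is the efficiency argument made above.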

So as we breed generations of ever more sophisticated artificial brains, with more efficient neural networks specialized for specific purposes, we want to steer them toward being human-like, or at least able to act and think like a human. For one, we could apply the same cognitive tests we give children, starting from age 5 and up, to develop them like a human child. Seems logical.

Then, as the AGI starts to grow up, we can borrow a trick from the film VFX and animation community: do a motion/performance capture of a person, recording their movement, facial expressions, and speech as they go through everyday routines, then train our artificial brain on that dataset, selecting the best performers each generation until we get a human mimic. It will not be AGI, nor human-level intelligence, but it is the best we can do until we make these systems genuinely think.

To take that all the way to AGI, I would create multiple such AI mimics and put them to work in different professions, writing some specialty code and evolving a specific AI for each profession, so that each has a broad but shallow layer of conversational ability and deep skills in its profession.

Now, if we have a network of hundreds of different professions, serving millions of clients at once, all with the same brain architecture and common language and interaction capabilities, how do we make an AGI? Maybe we just network them, and that becomes an AGI?

At the very least, we have a framework and input and output on which to train and evolve an AGI, so that all the specialty skills of each vocation are assumed by a more generalized AGI brain, and in the process, that AGI brain becomes better at all the skills humans excel at.

From there, we just keep going until we have a superintelligence, and beyond. Here is the diagram we started with. Deep neural nets don't scale past a certain point: no matter how many more layers of neurons we add, how much labelled data we train with, or how much compute we throw at the problem, the underlying network model is too crude, too approximate, and gains no further cognitive or problem-solving capability from these increases. It plateaus on simple problems and is incapable of solving more complex ones.

In our AGI design, we have mapped a very powerful, general purpose, analog spiking neural network computer on top of powerful NVIDIA GPU digital computers, and that flexible design not only scales with compute power, but also scales as those neural networks evolve faster and faster to become larger, broader, and more functional and efficient with time, giving an overall exponential increase in capability greater than the Moore's law increase in the underlying processor power.


How do you get more leads on affiliate links from ClickBank?

posted on June 1, 2021


There are several ways to get leads for any offer you want to promote and I will give you two of the best ways I know.

This tactic is called lead generation and is something I’m fairly good at.

One way to get leads that I have yet to talk about on Quora is to do a little survey. This works because it is similar to giving away something for free in order to get leads.

Imagine this: You come across a website that says it can match you with your best chance for working at home, which is something you’ve been looking to do for months now.

In the middle of the site, there’s a button that says, “TAKE SURVEY NOW” so you click the button to begin the survey.

There are a few questions that ask things like, “Are you coachable?”, “Are you willing to invest in your future and your family’s future?”, and “How serious are you about working from home?”

Each of those questions gets you to commit to your answer, so when the survey reveals your best match, you are more likely to opt in and use that offer.

The second way of getting leads is to run a contest. This works best if you offer some sort of digital product as the prize.

Let’s look at one of the methods I used to promote the Clickbank offer, CB Passive Income.

I offered a free 30-minute coaching call where I gave away my “secret” traffic method. If I remember correctly, I got about 30 leads the first day and 90 leads the second day.

This was back when I was blogging, so I basically had people sign up through my MailChimp landing page before sending them straight to CB Passive Income.

MailChimp no longer allows you to promote affiliate offers, but you can use Builderall, The Online Business and Digital Marketing Platform | Builderall USA, to build your funnels and landing pages.

If you do sign up using that link, you’ll also get unlimited support from me, because you’ll be signed up under me; it’s basically a win-win for both of us.

If I remember correctly, out of those 120 leads I got 30 sales, which came to $1,500 over a period of two weeks.
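For readers who want to check the funnel math, the figures quoted above (120 leads, 30 sales, $1,500) imply the per-sale commission and per-lead value below; the commission is derived from those numbers, not stated in the original.

```python
# Working through the quoted funnel numbers.
leads = 30 + 90            # day one + day two
sales = 30
revenue = 1500.0

conversion_rate = sales / leads        # fraction of leads that bought
commission_per_sale = revenue / sales  # implied payout per sale
revenue_per_lead = revenue / leads     # value of each captured lead
```

A 25% lead-to-sale conversion is unusually high for affiliate funnels, which is consistent with the survey-style pre-commitment tactic described earlier.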