Author: Chris Sutherland, CSutherland@jackhenry.com
In my last blog entry, we discussed “Virtualization…so what is it?” Virtualization is the creation of a virtual (rather than actual) version of your IT environment’s operating system, servers, or resources. When discussing virtualization, one of the most commonly asked questions is, “What can really be virtualized?” With time, this question is becoming easier to answer. In fact, in today’s ever-changing world of virtualization, there is very little that cannot be virtualized. The technology is so robust that almost everything can -- and in my opinion should -- be virtualized. With properly configured physical hardware and “Tier 1” storage, you can not only reduce hardware costs by moving from physical to virtual, but also benefit from a more highly available solution for your institution.
From an installation standpoint, the most commonly virtualized servers that we see working very well include Domain Controllers, Exchange Servers, File Servers, Print Servers, and Application Servers, including terminal servers. When you really think about it, an average physical server uses less than 10% of its total processing power, yet vendors increasingly tell you, “Don’t install this on that type of server.” (For example, in the past many small organizations would combine a Domain Controller with an Exchange Server, yet best practices recommend against this. Certain applications, such as IIS and Terminal Services, are also not recommended on Domain Controllers.) As an example, figure that a small business would need a minimum of four servers to run some basic applications. Now let’s migrate that example to a virtualized scenario with two physical servers and shared storage, giving you the ability to run all of these servers on shared resources. In this case, you now have the capability -- should hardware problems arise -- to stay up and running (versus having to wait on replacement parts before you can be back in business), which helps facilitate your business continuity objectives.
So then, the next question anticipated from a ProfitStars® client would be, “Which of your products can be run in a virtual environment?” Again, from our company standpoint, the better question is, “What cannot be virtualized?” The good news is that at the time of this blog there are very few products not supported for use in a virtual environment. That tells us that nearly all of the software provided by ProfitStars® can, in most cases, be virtualized. For more information on virtualization with ProfitStars products, speak directly with your account manager or salesperson, or contact Matrix Network Services directly at SalesMatrix@jackhenry.com.
Author: Jackie Marshall, JaMarshall@jackhenry.com
Many community bankers get the “chills” when the term social media is brought up in conversation. But why does one of the most exciting communication tools ever created conjure up such fear and trepidation? It’s for the same reason people fear a lot of things: fear of the unknown, plain and simple. And, to make matters worse, financial institutions (FIs) are inundated with intimidating media messages today: “Jump on the social media bandwagon or miss out!”, or, “If you don’t have a Facebook page, just close the door.” These kinds of messages can be almost as scary as the FDIC alerts some of us receive on Friday afternoons announcing the next wave of bank acquisitions and closings.
Over the last several months, I have been immersed in the subject of social media: speaking with bankers at recent JHA and ProfitStars user group meetings, hosting my own social media compliance webinar events, writing articles on the subject, and working one-on-one with FIs to customize social media strategies. During this time, I have come to realize that it’s not fear of fraud that is causing the anxiety, nor the risk of what employees are stating in their personal Facebook posts, nor the fact that FIs don’t have the personnel to manage this new communication channel. What has become clear is that many senior managers at FIs are not social media users themselves; in fact, many don’t even have a personal Facebook page. It can be difficult to relate to social media unless you are a user yourself and can personally appreciate the value of social media communication. Every FI needs a Social Media Champion: someone who is excited about and interested in social media opportunities and will look for ways to keep the FI’s brand and messaging fresh and dynamic. But even if you have such an individual on board, the channel won’t gain momentum unless key stakeholders in the FI (yourself included) are regular users. In a nutshell, if you don’t visit your FI’s Facebook page, your customers probably won’t either.
So, what’s the magic formula for capitalizing on this dynamic new communication channel? Take the time to learn about and embrace this new, unique and dynamic form of communication. The amount of time you personally spend exploring and experiencing social media may directly correlate to the benefit that your FI derives from it.
Author: Dan Roderick, email@example.com
When considering the purchase of a loan pricing solution, the topic of assumptions is always discussed at length. Great attention is often focused on the “accuracy” and “precision” of these assumptions and the sources of information. Let’s consider the real benefit of bringing a consistent, disciplined approach to commercial loan pricing. Primarily, the goal is to allow you to easily benchmark your loan pricing process over time.
Although a thorough understanding of the assumptions included in any model is fundamental, it is more important that, from a pricing perspective, assumptions be reasonable and applied consistently. All of the assumptions -- and all of the output -- of any pricing solution are relative, and ideally should reflect industry norms. For example, cost assumptions should reflect industry average costs rather than a particular bank’s actual costs. The point here is that in order to ensure your bank remains competitive and that your lenders are pricing on a “level playing field,” you are better served looking at the pricing equation from an “industry” perspective, rather than from an internally focused perspective. To take this a step further, if your bank has higher costs relative to your competitors, are your borrowers willing to pay more to cover those costs? Conversely, if your bank has below-average costs, do you pass those savings along to your borrowers or to your shareholders?
To use an analogy, a pricing methodology is like a medical test. Often, medical tests are conducted to set a baseline for a patient, basically to determine what’s “normal” for that individual. Then, future test results are interpreted relative to that baseline. It should be the same for any pricing solution. Every institution should establish a “baseline” by running every commercial loan it books (often above a minimum loan amount threshold) through the solution during the initial 60 days following implementation. Future results can then be interpreted relative to that baseline.
What about loan loss expense assumptions? To derive this expense empirically, one would need industry loss data on loans originally rated 1, 2, 3, and so on. If the risk premium differentials by risk rating turned out to be fairly insignificant, would any FI be satisfied to charge customers only that insignificant premium, or would it try to get more if the market would allow? Naturally, it would. Alternatively, if you determined that the premium needed was higher than the market will bear, would you curtail your commercial lending activities? Probably not.
Empirically derived loss data would not get to the root of the real question we should be asking ourselves: “How much of a premium are customers with a greater risk profile willing to pay versus customers with a lower risk profile?” In other words, what will the market bear? This differential may be greater than what would be required simply to cover the loss expense.
Test any pricing solution against current results, and adjust the starting-point ROE targets to “raise the bar” to the greatest extent possible, maximizing net interest income and the bottom line.
Author: Lee Wetherington, LWetherington@profitstars.com
So far, the biggest question of 2011 has been: what are financial institutions to do about the $10B in fee income threatened by Durbin? The conventional answers have centered on the fate of free checking. Without enough debit interchange to subsidize free checking, the reasoning goes, financial institutions must start charging fees, raise balance or card-usage requirements, or do some combination of the two.
These alternatives, however, come with their own threats. Will customers balk at new fees or restrictive conditions on checking? Will 60% indeed immediately switch to another financial institution—as recent research suggests? And is there any alternative to recoup the $10B without rankling the clientele and triggering mass defections?
Well, yes. There are several alternatives, but here’s the bottom line: instead of levying new charges on fee-sensitive consumers, financial institutions should first (1) incent consumers to use online and mobile channels that are more convenient for them and much more cost efficient for the financial institution, and (2) finally outfit small- to medium-sized businesses (SMBs) with the services they need and want, and for which they are happily willing to pay.
Let’s do the numbers. Just how much cost savings can financial institutions realize by converting offline customers to online banking and online customers to more intensive usage of services like online bill payment? According to Javelin Strategy & Research, the potential savings is $8.3B industry-wide, or $167 per customer per year. How is this possible? Compare the average in-person branch transaction cost of $4.25 with the average mobile transaction cost of 8 cents, and the math becomes clear.
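As a quick back-of-the-envelope sketch, here is that math using only the figures cited above; note that the per-customer transaction count is derived here for illustration, not a figure from the Javelin research itself:

```python
# Back-of-the-envelope channel-cost math using the figures cited above.
# (The per-customer transaction count below is derived, not a Javelin figure.)
branch_cost = 4.25   # average in-person branch transaction cost, USD
mobile_cost = 0.08   # average mobile transaction cost, USD

savings_per_txn = branch_cost - mobile_cost   # saved per transaction shifted to mobile
print(f"Savings per shifted transaction: ${savings_per_txn:.2f}")

# How many branch transactions per customer per year would have to shift
# to mobile to realize the quoted $167 per customer per year?
txns_to_shift = 167 / savings_per_txn
print(f"Roughly {txns_to_shift:.0f} transactions per customer per year")
```

In other words, moving roughly forty branch transactions per customer per year to mobile is enough to account for the quoted savings.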
But how can you incent more online usage to realize these substantial cost savings? For one, by giving customers a new and unique reason to log on. In 2011, that reason will be merchant-funded rewards indexed transaction by transaction in the online statement. Customers will receive targeted rewards and discounts from merchants with whom they already have relationships, and financial institutions will receive a share of the revenue generated for the merchant. Cost-efficient channel conversion meets new fee income alternative for the financial institution. It’s a “two-fer.”
But the fee-income opportunities don’t end there. Through the marketing of prepaid and credit cards, financial institutions can recoup half of the anticipated lost debit interchange revenue (due to Durbin) without charging additional fees for checking accounts or debit card transactions. That’s according to excellent analysis by Aite Group’s Ron Shevlin. So that’s $5B back in the coffer.
Currently, only 10% of SMBs have remote deposit capture (RDC), yet 35% are willing to pay for it. According to Celent, if deployments grew to 20% of SMBs, financial institutions could gain another $720M in fee revenue.
So, we’re already at $14B, and I’m just getting warmed up. That’s a $4B net gain post-Durbin.
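Tallying the figures cited in this post, the arithmetic works out like this (a rough sketch; all figures are as quoted above):

```python
# Rough tally of the fee-income offsets cited in this post, in billions of USD.
channel_savings = 8.3    # Javelin: converting customers to online/mobile channels
card_recoup     = 5.0    # Aite Group (Ron Shevlin): prepaid and credit card marketing
rdc_revenue     = 0.72   # Celent: SMB remote deposit capture fee revenue

durbin_loss = 10.0       # debit interchange fee income threatened by Durbin

total = channel_savings + card_recoup + rdc_revenue
print(f"Total offsets: ${total:.2f}B")                       # roughly $14B
print(f"Net gain post-Durbin: ${total - durbin_loss:.2f}B")  # roughly $4B
```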
And if Durbin gets tabled or goes away altogether? Just consider this $14B my personal gift to you. If you don’t need it, please load it onto a prepaid card and send it to my attention.