Update For Medicare Part B Coinsurance For Insulin

Starting July 1, 2023, Part B coinsurance for a month’s supply of insulin used in an insulin pump covered under the DME benefit can’t exceed $35.

The Centers for Medicare & Medicaid Services (CMS) will adjust payments to suppliers and pharmacies to account for the balance of the reduced coinsurance. Suppliers will continue to get the Medicare payment amount for the insulin (average sales price plus 6%) minus any applicable coinsurance, which is capped at $35 for a month’s supply.
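To make the payment math concrete, here is a minimal sketch of the capped coinsurance calculation. It assumes the standard Part B coinsurance of 20% of the Medicare payment amount; the function and field names below are illustrative only and are not part of any CMS specification.

```python
def insulin_part_b_amounts(asp: float, months: int = 1, cap_per_month: float = 35.00) -> dict:
    """Illustrative breakdown of the Medicare payment amount for pump insulin.

    asp           -- average sales price for the billed supply
    months        -- number of one-month supplies on the claim (1 or 3)
    cap_per_month -- coinsurance cap effective July 1, 2023
    """
    medicare_amount = asp * 1.06                    # average sales price plus 6%
    standard_coinsurance = 0.20 * medicare_amount   # usual Part B coinsurance share (assumed 20%)
    patient_coinsurance = min(standard_coinsurance, cap_per_month * months)
    # Medicare's payment to the supplier is the full amount minus the capped coinsurance,
    # so the supplier is made whole once the patient's (reduced) coinsurance is collected.
    medicare_pays_supplier = medicare_amount - patient_coinsurance
    return {
        "medicare_amount": round(medicare_amount, 2),
        "patient_coinsurance": round(patient_coinsurance, 2),
        "medicare_pays_supplier": round(medicare_pays_supplier, 2),
    }

# Example: a one-month supply with an average sales price of $250
print(insulin_part_b_amounts(asp=250.00))
# {'medicare_amount': 265.0, 'patient_coinsurance': 35.0, 'medicare_pays_supplier': 230.0}
```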

Don’t bill for insulin supplies covering July or later months before July 2023. CMS will complete its system updates to make sure patients aren’t charged more than the $35 maximum allowed for the month of July. Your DME Medicare Administrative Contractor will also educate you about billing during the May–June transition period.

CMS is adding 2 new modifiers to the April 2023 HCPCS quarterly update file:

  • JK – Short Descriptor: Drug 1-month supply or less; Long Descriptor: One month supply or less of drug/biological
  • JL – Short Descriptor: Drug 3-month supply; Long Descriptor: Three month supply of drug/biological

Split Billing:

  • Before July 2023: For “from date of service” in May or June 2023, don’t bill a 3-month supply of insulin. Instead, bill a 1-month supply of insulin with the JK modifier.
  • Starting July 2023: For “from date of service” in July and later, bill a 3-month supply of insulin with the JL modifier or a 1-month supply with the JK modifier.
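As a rough illustration of the split-billing rule above, the sketch below maps a claim’s “from date of service” and supply length to the JK and JL modifiers. The helper name is hypothetical and the logic only covers dates of service from May 2023 onward; follow your DME MAC’s actual billing instructions for real claims.

```python
from datetime import date

JK = "JK"  # one-month supply or less of drug/biological
JL = "JL"  # three-month supply of drug/biological

def insulin_supply_modifier(from_date: date, months_supplied: int) -> str:
    """Pick the HCPCS modifier for pump insulin under the 2023 split-billing rule."""
    cutover = date(2023, 7, 1)
    if date(2023, 5, 1) <= from_date < cutover:
        # May-June 2023 transition: bill only a 1-month supply, with JK.
        if months_supplied != 1:
            raise ValueError("Before July 2023, bill a 1-month supply (JK), not a 3-month supply.")
        return JK
    if from_date >= cutover:
        # July 2023 and later: a 1-month (JK) or 3-month (JL) supply is allowed.
        if months_supplied == 1:
            return JK
        if months_supplied == 3:
            return JL
        raise ValueError("Expected a 1-month or 3-month supply.")
    raise ValueError("Dates of service before May 2023 follow the prior billing rules.")

print(insulin_supply_modifier(date(2023, 6, 15), 1))  # JK
print(insulin_supply_modifier(date(2023, 7, 10), 3))  # JL
```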

Source: U.S. Centers for Medicare & Medicaid Services (CMS)

About 3Gen
3Gen Consulting has always been at the forefront when it comes to adding value, and we believe in setting our own standards. What sets us apart is the investment we make in our people, processes and innovation to provide you with market-leading healthcare revenue cycle management services. We work as an extension of our clients’ teams by focusing on their key challenges, aligning with their culture and delivering the best results.

If you’re interested in finding the right medical billing and coding services partner, contact us today.

HHS Renews COVID-19 Public Health Emergency Until May

As a result of the continued consequences of the Coronavirus Disease 2019 (COVID-19) pandemic, the Secretary of Health and Human Services, Xavier Becerra, renewed the COVID-19 public health emergency (PHE) for another 90 days, effective February 11, 2023.

Source: U.S. Department of Health and Human Services (HHS)

Are You Moving Too Early on AI in Medical Coding Services?

Artificial Intelligence (AI) solutions have been hyped up in everything from customer service to disease diagnosis – and the healthcare revenue cycle hasn’t been exempt.

You’ve likely heard about how AI is a great fit for the healthcare revenue cycle – it’s a treasure trove of data and of workflows that require manual input. Many healthcare leaders have looked specifically at medical coding solutions, asking whether AI could be beneficial there.

While AI is a tempting solution, anyone considering applying it to their medical coding services should take a step back: AI may show amazing promise, but it may still be too early for the healthcare revenue cycle, and a decision made too soon can be difficult to reverse.

Data Quality Is a Challenge
An old truth about data applies even more strongly to AI – “garbage in, garbage out”. This lesson has emerged clearly in clinical analytics, where organizations have learned that their results depend on clinicians feeding clean and comprehensive data into their EHRs. They’ve learned that if information is entered incorrectly, it flows to other people using the data, compromising data quality and record integrity [1].

This means the same accuracy issues you might have had in the past will persist even under an AI solution – not just inaccurate information, but also gaps in coding data you might not yet have resolved.

No matter how good your AI models are or how much data you feed them, you won’t get the results you’re looking for if your data quality isn’t where it needs to be. Any organization evaluating AI will need to answer some tough questions about their data governance policies and strategies before even considering investing in an AI solution. Skip this step at your own risk.
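As a purely illustrative example of what those questions look like in practice, the sketch below audits a batch of coded claims for obvious gaps before anyone feeds them to a model. The field names and the loose ICD-10 format check are hypothetical placeholders; a real data-governance review goes much deeper.

```python
import re

# Loose ICD-10-CM shape check (e.g., "E11.9"); illustrative only, not a full validator.
ICD10_PATTERN = re.compile(r"^[A-Z][0-9][0-9A-Z](\.[0-9A-Z]{1,4})?$")

def audit_coded_claims(claims: list[dict]) -> dict:
    """Count basic data-quality problems in a batch of coded claims.

    Each claim is assumed (hypothetically) to look like:
    {"claim_id": "...", "diagnosis_codes": ["E11.9", ...], "provider_note": "..."}
    """
    missing_codes = sum(1 for c in claims if not c.get("diagnosis_codes"))
    malformed_codes = sum(
        1
        for c in claims
        for code in c.get("diagnosis_codes", [])
        if not ICD10_PATTERN.match(code)
    )
    missing_notes = sum(1 for c in claims if not c.get("provider_note"))
    return {
        "claims": len(claims),
        "missing_diagnosis_codes": missing_codes,
        "malformed_diagnosis_codes": malformed_codes,
        "missing_provider_notes": missing_notes,
    }

sample = [
    {"claim_id": "1", "diagnosis_codes": ["E11.9"], "provider_note": "follow-up visit"},
    {"claim_id": "2", "diagnosis_codes": [], "provider_note": ""},
]
print(audit_coded_claims(sample))
# {'claims': 2, 'missing_diagnosis_codes': 1, 'malformed_diagnosis_codes': 0, 'missing_provider_notes': 1}
```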

Trust Should Be a Priority
AI has a particular problem with trust, and it’s especially acute in healthcare. Consider some recent commentary on “the black box problem” in AI and machine learning.

“There is much confusion about the black box problem of AI. Many AI algorithms are not explainable, even by the programmers who created them, as the code evolves over several virtual generations and ends up as a complex code whose working is opaque to us humans. We are unable to see the ‘rough work,’ only the final answer. Thus, especially in the critical field of healthcare, there is a big doubt whether we can trust AI.” [2]

It’s worth asking if you can find a solution you trust enough to implement in your revenue cycle.

Security Is an Issue
Healthcare has been a leading target for identity theft for years now, and while many organizations have made significant progress in their cybersecurity efforts, hackers and cybercriminals still find healthcare extremely appealing. For any organization that is still struggling with security risks, or that has only just found its footing, AI solutions for medical coding could derail present or future progress.

AI runs on your data networks, which means it’s inherently exposed to security risks – and its risk profile differs from traditional software. Vulnerabilities in traditional software generally trace back to issues with design and source code. In AI, the attack surface extends to the images, text, audio files, and other data used to train and run models. Your security team will need to be aware of an entirely new class of threat to keep your data safe from malicious actors [3].

You’ll Still Need People
The way some people discuss AI in the revenue cycle, you’d think it was as simple as flipping a switch and enjoying the benefits of increased efficiency and reduced long-term costs. The truth about any AI solution is that, while the technology has come a long way, it still needs some level of human oversight. And while AI can reduce your need for certain positions, it will also launch you into a new phase of hiring and recruitment.

You will need an AI team to manage your projects. This team will require a range of skills and backgrounds and will need to operate at a high level of collaboration for you to get the results you’re looking for. Gartner has estimated that half of IT leaders will struggle to move their AI projects beyond the conceptual stage – partly because the data scientists they have are stretched too thin [4], and largely because of a talent shortage in supporting roles. Consider that AI itself is dealing with a shortage of labor, skills, expertise, and knowledge – an issue that stands out as the number one barrier to adoption for many organizations.

Ethical Issues Still Exist
Ethics are paramount at every level of healthcare, the revenue cycle included. Unfortunately, incorporating ethics and morality into AI is still a struggle. This is especially true as the importance of social determinants of health (SDoH) continues to grow for healthcare decision makers. AI still may not be able to weigh the considerations that are specific to an individual situation.

Risk of Failure
Perhaps the greatest risk of jumping too early into AI for a medical coding solution is what happens if the solution doesn’t work out.

As things stand now, if your AI solution fails, you’re left with little recourse: a severe interruption in your revenue cycle while you rebuild, a scramble to quickly hire and train new staff, or a rushed search through medical coding companies and medical coding solutions to help you get moving again.

While we believe in the potential of AI in medical coding services, we also believe that now may be too early for most organizations to take that step. To learn what medical coding services could be a smarter choice for you as AI finds its footing, start here.

References
[1] J. Bresnick, “What are the barriers to clinical analytics, big data success?,” Health IT Analytics, 30 July 2014. Available: https://healthitanalytics.com/news/what-are-the-barriers-to-clinical-analytics-big-data-success.
[2] J. D. Akkara and A. Kuriakose, “Commentary: Is human supervision needed for artificial intelligence?,” Indian J Ophthalmol, vol. 70, no. 4, pp. 1138–1139, 2022.
[3] B. Dickson, “Machine learning security vulnerabilities are a growing threat to the web, report highlights,” PortSwigger Ltd., 30 June 2021. Available: https://portswigger.net/daily-swig/machine-learning-security-vulnerabilities-are-a-growing-threat-to-the-web-report-highlights.
[4] L. Goasduff, “How to Staff Your AI Team,” Gartner, 15 December 2020. Available: https://www.gartner.com/smarterwithgartner/how-to-staff-your-ai-team.
