



We are a group with more than 10 years of experience and a high capacity for innovation and creativity, driven by challenges and passion for what we do.

Focused on generating value and knowledge, we apply data analytics and high-performance computing techniques that allow us to make better decisions in less time, working as a team with our clients to deliver 110%.


We design solutions for our clients, integrating data analytics techniques that ease strategic decision making in less time and optimize their processes and resources.


To be the leading regional company in the consultancy and implementation of data analytics solutions, becoming an agent that generates value and innovation for our environment.


  • Innovation and Creativity

  • Teamwork

  • Passion for challenges

  • Knowledge

  • 110%




In its most basic form, this data lets you infer things that have happened in a process. In health services, for example: how many patients were admitted last December? How many returned within the first 30 days? How many acquired an infection or suffered a hospital safety error?

Here we find the ability to quantify and clarify events that occurred and resources consumed, and to report them clearly, in graphs for example, helping to manage a population, compare against government expectations, or identify areas where quality measures can be improved.
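As a minimal sketch of this descriptive level, the questions above can be answered with a few lines of pandas. The admissions table below is entirely made up for illustration; the column names and values are assumptions, not client data:

```python
import pandas as pd

# Hypothetical admissions records (illustrative only)
admissions = pd.DataFrame({
    "patient_id": [1, 1, 2, 3, 3],
    "admit_date": pd.to_datetime(
        ["2023-12-01", "2023-12-20", "2023-12-05", "2023-12-10", "2024-02-01"]
    ),
})

# How many patients were admitted last December?
december = admissions[admissions["admit_date"].dt.month == 12]
print("Admissions in December:", len(december))

# How many returned within 30 days? Compare consecutive stays per patient.
admissions = admissions.sort_values(["patient_id", "admit_date"])
gap = admissions.groupby("patient_id")["admit_date"].diff().dt.days
print("Readmissions within 30 days:", int((gap <= 30).sum()))
```

The same grouped counts feed directly into the graphs and comparisons described above.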

These are the first steps in converting data into knowledge, and from them we learn how to continue up the following levels of value.

Although this is a conceptually elementary level, it generates value for the organization, and for many organizations even this type of analytics is out of reach, either because they lack suitable infrastructure or because they have not implemented the programs, projects and capabilities needed to solve technical problems such as information extraction.


When you have a sufficiently complete and accurate descriptive and diagnostic analysis, you can acquire the ability to forecast what might happen in the future.

Organizations seek evidence-based ways to reduce unnecessary costs and try to avoid, for example, consequences of errors or adverse events that can be prevented.

Typically, access to real-time data is required to enable agile decision making, which demands specific and often robust infrastructure that integrates the devices generating the data to provide up-to-date information.

Organizations that have overcome the initial stages and hurdles are doing incredible things, including for their own financial health, by predicting and identifying risks, increasing vigilance and helping to make decisions.

Advanced decision support by cognitive computing engines, natural language processing and text analytics can even help identify information that might otherwise be ignored.


At the last level, the question answered is: how can we make things happen? The answer builds on the knowledge already generated to provide the ability to act on upcoming events.

Prescriptive analytics not only predicts what is likely to happen, it also actively suggests the best way to avoid or mitigate a negative circumstance, and is becoming a reality with the incursion of the internet of things (IoT).
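In its simplest form, a prescriptive step is a decision rule layered on top of a predictive score. The sketch below is purely hypothetical: the thresholds, the readmission framing and the suggested actions are assumptions for illustration, not an actual clinical policy:

```python
# Minimal prescriptive sketch: map a predicted risk to a suggested action.
# Thresholds and actions are illustrative assumptions only.
def prescribe(readmission_risk: float) -> str:
    """Turn a predicted 30-day readmission probability into a recommendation."""
    if readmission_risk >= 0.7:
        return "schedule home visit within 48 hours"
    if readmission_risk >= 0.4:
        return "phone follow-up within one week"
    return "standard discharge instructions"

print(prescribe(0.85))  # a high-risk patient triggers the strongest action
```

In a real deployment the rule would be driven by the predictive models and, increasingly, by IoT signals, but the shape is the same: prediction in, recommended action out.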

The future of analytics is almost limitless, and while many are still trying to claw their way forward with nearly unusable historical data, others are moving ahead using data science tools as the answer to critical situations, with significant advances in quality, timeliness, efficiency and effectiveness.



For more than 2 years, Comfamiliar Risaralda and BIT DATA Healthcare S.A.S. joined forces to develop a project for the debugging and cleaning of data associated with clinical and economic management, which, together with big data modeling techniques, was used to establish high-reliability estimates that point to appropriate decisions, with the purpose of improving the clinical and economic efficiency of the health services offered by Comfamiliar Risaralda.

For this purpose, data extraction, transformation and loading (ETL) was performed on the cardiovascular and oncological cohorts to construct models that allow the identification of factors related to the risk of complications generated in patients with high-impact diagnoses.

Then, profiles related to the economic health of the institution were addressed, with models that help identify the variables involved in efficiency, effectiveness and performance.

Based on these models, the potential consumption and health outcomes of patients treated at the institution were estimated, and by means of automatic evaluation using explanation and interpretation models, the impact of the variables in the predictive models was identified in search of the best results for patient health, generating the best rates of positive outcomes while maintaining optimal use of economic resources.

Finally, by establishing differential impacts, in terms of costs, between the models with positive and negative outcomes, a savings target was defined for the development of the project.

These results establish the presence of variables involved in generating highly efficient and effective clinical processes.


A team of data scientists and engineers from our organization, together with stakeholders and process experts in the client's organization, will turn your intuitions into assets with measurable value through a consulting process that generates analytics products of varying impact and scale.

Analytics products can take the form of dashboards, models, predictions, forecasts, prescriptions, estimates, summary tables and visualizations, generated and used through queries and on-demand requests according to the potential of the data.

We use and build efficient, scalable and high-performance analytical tools based on massive data provided by the organization, with technologies of high scientific and technological value, such as scientific and reproducible programming environments (e.g. Jupyter notebooks).

We have the ability to work with structured and unstructured data, such as relational database tables, text, social networks, geographic data, audio and images.

We implement agile development processes with continuous improvement and project management adjusted to PMI guidelines, supported by DevOps and MLOps tools with repositories for version control and collaborative work.

We apply the philosophies considered in various ethical frameworks for artificial intelligence when integrating models that allow pattern discovery, hand in hand with the information security and data management practices and guidelines that best suit our stakeholders' needs and requirements.

During the implementation of the data management and ETL stages, we rely on Python, R, pandas, Hadoop, and data extraction and formatting libraries such as spaCy and NLTK for text processing.
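A minimal ETL sketch with pandas, on an invented inline export (the column names, values and output filename are assumptions for illustration):

```python
import io
import pandas as pd

# Extract: in practice this comes from the source system; here an inline
# string stands in for a raw export with messy free-text diagnoses.
raw = io.StringIO(
    "patient_id,diagnosis,cost\n"
    "1,  Heart Failure ,1200\n"
    "2,heart failure,980\n"
    "3,Lymphoma,2500\n"
)
df = pd.read_csv(raw)

# Transform: normalize the free-text field and enforce numeric costs
df["diagnosis"] = df["diagnosis"].str.strip().str.lower()
df["cost"] = pd.to_numeric(df["cost"], errors="coerce")

# Load: write the cleaned table to the analytics store
df.to_csv("clean_admissions.csv", index=False)
```

Text-heavy fields would go through spaCy or NLTK at the transform step instead of plain string methods.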

In the descriptive stages, we use exploratory graphics, statistics, linear algebra and classical mathematics on the transformed data to build an understanding of and connection with the information, using technologies such as matplotlib, seaborn, pandas and statsmodels.
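For instance, a minimal descriptive pass with pandas over made-up length-of-stay values (the numbers are illustrative only):

```python
import pandas as pd

# Illustrative length-of-stay data in days (synthetic values)
los = pd.Series([2, 3, 3, 5, 8, 13, 2, 4], name="length_of_stay")

# Classical summary statistics used in the descriptive stage
print(los.describe())  # count, mean, std, quartiles

# A simple exploratory check: a mean above the median suggests a
# right-skewed distribution driven by a few long stays
print("skewed right:", los.mean() > los.median())
```

The same series would typically be plotted as a histogram or boxplot with matplotlib or seaborn.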

For projects requiring diagnostic analytics, correlations are computed using sklearn and statsmodels.
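As a minimal diagnostic sketch, pandas' built-in Pearson correlation stands in here for the fuller sklearn/statsmodels workflow, on invented cohort variables (all values are synthetic):

```python
import pandas as pd

# Illustrative cohort variables (synthetic): age, length of stay, cost
df = pd.DataFrame({
    "age": [40, 55, 62, 70, 81],
    "length_of_stay": [2, 3, 5, 6, 9],
    "cost": [900, 1100, 1800, 2100, 3000],
})

# Pairwise Pearson correlations: the first pass at diagnosing which
# variables move together before any causal or model-based analysis
corr = df.corr()
print(corr.round(2))
```

A strong correlation flagged here would then be examined with regression diagnostics in statsmodels.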

In predictive and prescriptive stages we use machine learning models, such as neural networks, decision trees, support vector machines or regression models, built on tools such as tensorflow, pytorch and sklearn that allow automated classification, estimation, forecasting and prediction tasks.
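As a minimal sketch of this stage, a decision tree fitted with sklearn on a tiny synthetic dataset; the features, labels and readmission framing are assumptions for illustration only:

```python
from sklearn.tree import DecisionTreeClassifier

# Synthetic training data: [age, prior_admissions] -> readmitted (0/1).
# Values are made up, chosen only to show the fit/predict workflow.
X = [[40, 0], [50, 1], [65, 3], [72, 4], [35, 0], [80, 5]]
y = [0, 0, 1, 1, 0, 1]

model = DecisionTreeClassifier(max_depth=2, random_state=0)
model.fit(X, y)

# Classify a new, unseen patient profile
print(model.predict([[70, 3]]))
```

The same fit/predict interface carries over to the regression, forecasting and estimation tasks mentioned above, and to the tensorflow and pytorch models for larger problems.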
