Complex Big Data systems in modern organisations are increasingly becoming targets of existing and emerging threat agents, and elaborate, specialised attacks will continue to be crafted to exploit their vulnerabilities and weaknesses. With cybercrime and incidents arising from these vulnerabilities on an ever-increasing trend, effective vulnerability management is imperative for modern organisations regardless of their size. However, organisations struggle to manage the sheer volume of vulnerabilities discovered on their networks, and vulnerability management in practice tends to be reactive. Rigorous statistical models of the anticipated volume and dependence of vulnerability disclosures can provide organisations with important insights and help them manage cyber risks more proactively. By leveraging rich yet complex historical vulnerability data, our proposed framework enables this new capability. Building on this framework, we initiate a study that not only handles persistent volatility in the data but also unveils the multivariate dependence structure amongst different vulnerability risks. In sharp contrast to existing studies of univariate time series, we consider the more general multivariate case, striving to capture the relationships between series. Through extensive empirical studies on real-world vulnerability data, we show that a composite model can effectively capture and preserve long-term dependence between different vulnerability and exploit disclosures. In addition, the paper paves the way for further study of the stochastic behaviour of vulnerability proliferation, towards building more accurate measures for better cyber risk management as a whole.
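To make the abstract's notion of a composite model concrete, the following is a minimal sketch of a process that exhibits both features named above: persistent volatility within each series and cross-series dependence between them. It simulates a bivariate constant-conditional-correlation GARCH(1,1); the function name and all parameter values are illustrative assumptions for exposition, not the paper's actual model or estimates.

```python
import numpy as np

def simulate_ccc_garch(n, omega, alpha, beta, corr, seed=0):
    """Simulate a bivariate CCC-GARCH(1,1) process.

    Each series i has conditional variance
        h[t, i] = omega[i] + alpha[i] * r[t-1, i]**2 + beta[i] * h[t-1, i],
    and the innovations share a constant correlation `corr`, so the
    series show persistent volatility and cross-series dependence.
    All parameter values are illustrative, not estimated from real data.
    """
    rng = np.random.default_rng(seed)
    omega, alpha, beta = map(np.asarray, (omega, alpha, beta))
    k = len(omega)
    # Constant correlation matrix for the innovations.
    R = np.full((k, k), corr) + (1.0 - corr) * np.eye(k)
    L = np.linalg.cholesky(R)
    z = rng.standard_normal((n, k)) @ L.T   # correlated N(0, R) shocks
    h = np.empty((n, k))                    # conditional variances
    r = np.empty((n, k))                    # simulated disclosure shocks
    h[0] = omega / (1.0 - alpha - beta)     # unconditional variance
    r[0] = np.sqrt(h[0]) * z[0]
    for t in range(1, n):
        h[t] = omega + alpha * r[t - 1] ** 2 + beta * h[t - 1]
        r[t] = np.sqrt(h[t]) * z[t]
    return r, h

r, h = simulate_ccc_garch(
    n=2000,
    omega=[0.05, 0.08],
    alpha=[0.10, 0.12],
    beta=[0.85, 0.80],   # alpha + beta near 1 -> persistent volatility
    corr=0.6,            # cross-series dependence of the shocks
)
```

Standardising the simulated series by their conditional volatilities, `r / np.sqrt(h)`, recovers innovations whose sample correlation is close to the chosen `corr`, which is the kind of dependence structure a composite model must capture and preserve.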