  1. What types of data structures are routinely used for in-memory, real-time transaction scoring? I've used circular doubly linked lists to store (say) the 20 most recent transactions, with time stamp and other attributes, per merchant / per customer.
  2. What kind of metrics work well in this context? Among many metrics, I've used time since the last transaction, or time elapsed since the 5th previous transaction.
  3. Do you use a lot of rather small lookup tables that you can upload in memory, to store historical data, such as merchant summary statistics broken down per day, for the last 3 months (one entry per merchant per day)?
  4. How do you optimize server performance? For instance, at 2am, when the volume of transactions is 5 times lower than at peak time, do you use the analytic servers for other tasks, such as end-of-day re-scoring?
  5. At peak time (severe peaks), do you use a simplified model that requires less memory, if you lack bandwidth?
  6. Has anybody used the Hadoop environment to feed a true real-time processing system (that is, with no latency), such as credit card processing?
  7. For data science ROI to be positive, should advanced analytics / data science costs (in terms of people, extra hardware and software) represent less than 10% of the cost of general computer architecture (servers, engineers, basic data processing and reporting)? Is there a magic number, and if it is not 10%, what would it be?
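Regarding questions 1 and 2, here is a minimal sketch of one possible approach. A `collections.deque` with `maxlen` behaves like the circular list described: appending the 21st transaction silently evicts the oldest, so the buffer stays fixed-size with O(1) updates. The names (`record`, `time_to_kth_previous`) and the per-customer dict are illustrative assumptions, not an established API.

```python
from collections import deque
from datetime import datetime, timedelta

# Hypothetical sketch: a fixed-size in-memory buffer of the 20 most recent
# transactions per customer. deque(maxlen=20) evicts the oldest entry
# automatically, mimicking a circular doubly linked list.
RECENT = {}  # customer_id -> deque of (timestamp, amount)

def record(customer_id, timestamp, amount, window=20):
    buf = RECENT.setdefault(customer_id, deque(maxlen=window))
    buf.append((timestamp, amount))

def time_to_kth_previous(customer_id, now, k=5):
    """Seconds elapsed since the k-th previous transaction (the metric
    from question 2), or None if fewer than k transactions are on record."""
    buf = RECENT.get(customer_id)
    if buf is None or len(buf) < k:
        return None
    ts, _ = buf[-k]
    return (now - ts).total_seconds()
```

A very short burst of transactions drives `time_to_kth_previous` toward zero, which is exactly the kind of velocity signal useful for fraud scoring.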
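For question 3, a plain dict keyed by (merchant, day) is often enough for such a lookup table: 90 days of daily aggregates per merchant is small, so even millions of merchants can fit in RAM. The schema below (count and total per day) is an assumed example, not a prescribed one.

```python
from datetime import date

# Hypothetical sketch: merchant summary statistics, one entry per
# (merchant_id, day), covering a rolling 90-day window held in memory.
STATS = {}  # (merchant_id, day) -> {"count": int, "total": float}

def upsert_daily(merchant_id, day, amount):
    # Fold one transaction into that merchant's daily aggregates.
    row = STATS.setdefault((merchant_id, day), {"count": 0, "total": 0.0})
    row["count"] += 1
    row["total"] += amount

def daily_average(merchant_id, day):
    # Average ticket size for the merchant on that day, or None if unseen.
    row = STATS.get((merchant_id, day))
    return row["total"] / row["count"] if row else None
```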
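One way to handle the peak-time situation in question 5 is explicit model switching: when throughput crosses a threshold, route transactions to a cheaper fallback model. Everything here is a hypothetical sketch — the two scoring functions are placeholders, and the threshold of 5,000 transactions per second is an assumed figure.

```python
# Hypothetical sketch: graceful degradation under severe peak load.
def full_score(txn):
    # Placeholder for the expensive, memory-hungry model.
    return 0.9 * txn["risk"]

def simple_score(txn):
    # Placeholder for a lightweight fallback model.
    return txn["risk"]

def score(txn, current_tps, peak_tps=5000):
    # Above the assumed peak threshold, fall back to the cheap model.
    model = simple_score if current_tps > peak_tps else full_score
    return model(txn)
```

The same switch can run in reverse at 2 am (question 4): when load drops well below the threshold, spare capacity can be handed to batch jobs such as end-of-day re-scoring.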

© 2019   Data Science Central ®