In this paper, we present an inclusive application that enables visually impaired people to interact socially with other individuals in leisure activities such as card games. To this end, we built a generalizable model trained to detect and recognize playing cards using a convolutional neural network. A system called Smart Assistant was designed and implemented on top of TensorFlow's Object Detection API. A digital camera placed at a predefined location detects cards in real time, and the detected cards are then passed to the classifier. After classification, the SAPI text-to-speech (TTS) engine converts the labels of the recognized cards from text to speech output. Experiments show that, in real game situations, the application identifies and classifies cards with high accuracy.
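The detection-to-speech loop can be sketched as follows. This is only an illustrative outline, not the authors' implementation: the model path, label map, camera index, and confidence threshold are assumptions, and pyttsx3 is used here as one common Python binding to the SAPI speech engine.

```python
# Illustrative sketch of the pipeline: camera frame -> Object Detection API
# model -> card labels -> speech output. Paths and values are placeholders.
import cv2
import numpy as np
import tensorflow as tf
import pyttsx3  # on Windows this backs onto the SAPI speech engine

detect_fn = tf.saved_model.load("exported_model/saved_model")  # hypothetical export path
label_map = {1: "ace of spades"}  # placeholder; a real map covers all 52 cards

tts = pyttsx3.init()
cap = cv2.VideoCapture(0)  # camera at the predefined location

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # The Object Detection API expects a batched uint8 tensor in RGB order.
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    detections = detect_fn(tf.convert_to_tensor(rgb[np.newaxis, ...], dtype=tf.uint8))

    scores = detections["detection_scores"][0].numpy()
    classes = detections["detection_classes"][0].numpy().astype(int)
    for score, cls in zip(scores, classes):
        if score < 0.6:  # confidence threshold is an assumption
            continue
        tts.say(label_map.get(cls, "unknown card"))  # speak the recognized card label
    tts.runAndWait()

cap.release()
```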