Search Results - bidirectional encoder representations from transformers

  • Showing 1 - 11 results of 11
  1.
  2.

    A dual-phase deep learning framework for advanced phishing detection using the novel OptSHQCNN approach by Srikanth Meda, Vangipuram Sesha Srinivas, Killi Chandra Bhushana Rao, Repudi Ramesh, Narasimha Rao Yamarthi

    Published 2025-07-01
“…Results: In the post-deployment phase, the URL is encoded using Optimized Bidirectional Encoder Representations from Transformers (OptBERT), after which the features are extracted. …”
    Get full text
    Article
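The entry above describes encoding a URL with an optimized BERT variant (OptBERT) before extracting features. As a rough illustration of that encoding step only, here is a minimal sketch using the Hugging Face transformers library with the standard bert-base-uncased checkpoint as a stand-in for OptBERT (the paper's own model, not reproduced here); the example URL is hypothetical.

```python
# Minimal sketch: encode a URL string with a standard BERT model and take
# the [CLS] embedding as a feature vector. "bert-base-uncased" is only a
# stand-in for the paper's OptBERT.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

url = "http://example.com/login?session=abc123"  # hypothetical URL
inputs = tokenizer(url, return_tensors="pt", truncation=True, max_length=64)

with torch.no_grad():
    outputs = model(**inputs)

# [CLS] token embedding: a 768-dimensional feature vector for the URL
features = outputs.last_hidden_state[:, 0, :]
print(features.shape)  # torch.Size([1, 768])
```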
  3.

    Overview of deep learning and large language models in machine translation: a special perspective on the Arabic language by Sanaa Abou Elhamayed, Mohamed Nour

    Published 2025-06-01
“…The bidirectional encoder representations from transformers (BERT) model and LLMs are presented to utilize the large amount of textual data to learn translation patterns. …”
    Get full text
    Article
  4.

    Identifying Non-Functional Requirements From Unconstrained Documents Using Natural Language Processing and Machine Learning Approaches by Qais A. Shreda, Abualsoud A. Hanani

    Published 2025-01-01
“…In our approach, features were extracted from the requirement sentences using four different natural language processing methods, including statistical analysis and state-of-the-art semantic analysis represented by Google's word2vec and bidirectional encoder representations from transformers (BERT) models. …”
    Get full text
    Article
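The entry above extracts sentence features with both Google word2vec and BERT. The following is a small sketch of those two embedding routes on a toy pair of requirement sentences (hypothetical examples, not the paper's data), using gensim for word2vec and Hugging Face transformers for mean-pooled BERT sentence vectors.

```python
# Sketch: two of the feature-extraction routes the abstract mentions, on a
# toy corpus of requirement sentences (hypothetical examples).
import numpy as np
import torch
from gensim.models import Word2Vec
from transformers import AutoTokenizer, AutoModel

sentences = [
    "the system shall respond within two seconds",
    "user data must be encrypted at rest",
]
tokenized = [s.split() for s in sentences]

# word2vec: average the word vectors of a sentence into one feature vector
w2v = Word2Vec(tokenized, vector_size=100, min_count=1, epochs=20)
w2v_features = np.array([w2v.wv[tokens].mean(axis=0) for tokens in tokenized])

# BERT: mean-pool the last hidden states into one vector per sentence
tok = AutoTokenizer.from_pretrained("bert-base-uncased")
bert = AutoModel.from_pretrained("bert-base-uncased")
enc = tok(sentences, return_tensors="pt", padding=True, truncation=True)
with torch.no_grad():
    hidden = bert(**enc).last_hidden_state
mask = enc["attention_mask"].unsqueeze(-1)
bert_features = (hidden * mask).sum(1) / mask.sum(1)

print(w2v_features.shape, bert_features.shape)  # (2, 100) torch.Size([2, 768])
```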
  5.

    A hybrid BERT-BiRNN framework for mental health prediction using textual data by Muhammad Nouman, Sui Yang Khoo, M.A. Parvez Mahmud, Abbas Z. Kouzani

    Published 2025-09-01
“…This study employs a labelled text dataset derived from the Lyf Support app. To harness the potential of this dataset for the development of a mental health prediction tool, we propose a novel technique that utilises the bidirectional encoder representations from transformers (BERT) model to identify mental health-related text chats. …”
    Get full text
    Article
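The abstract above uses BERT to flag mental-health-related text chats. Here is a minimal, hedged sketch of that kind of sequence classification with the transformers library; the labels, example message, and untrained classification head are placeholders rather than the paper's fine-tuned model.

```python
# Sketch: classifying a chat message with a BERT sequence-classification head.
# The two labels and the example message are hypothetical; the paper's
# fine-tuned weights and dataset are not reproduced here.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # 0 = other, 1 = mental-health-related
)

text = "I've been feeling really anxious and can't sleep lately."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits
probs = torch.softmax(logits, dim=-1)
print(probs)  # class probabilities (head is untrained here, so roughly uniform)
```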
  6.

    A Hybrid Deep Learning Approach for Cotton Plant Disease Detection Using BERT-ResNet-PSO by Chetanpal Singh, Santoso Wibowo, Srimannarayana Grandhi

    Published 2025-06-01
    “…It is, therefore, crucial to accurately identify leaf diseases in cotton plants to prevent any negative effects on yield. This paper presents a hybrid deep learning approach based on Bidirectional Encoder Representations from Transformers with Residual network and particle swarm optimization (BERT-ResNet-PSO) for detecting cotton plant diseases. …”
    Get full text
    Article
  7.

    EYE-Llama, an in-domain large language model for ophthalmology by Tania Haghighi, Sina Gholami, Jared Todd Sokol, Enaika Kishnani, Adnan Ahsaniyan, Holakou Rahmanian, Fares Hedayati, Theodore Leng, Minhaj Nur Alam

    Published 2025-07-01
    “…We evaluated EYE-Llama against Llama 2, Llama 3, Meditron, ChatDoctor, ChatGPT, and several other LLMs. Using BERT (Bidirectional Encoder Representations from Transformers) score, BART (Bidirectional and Auto-Regressive Transformer) score, and BLEU (Bilingual Evaluation Understudy) metrics, EYE-Llama achieved superior scores. …”
    Get full text
    Article
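The EYE-Llama entry above evaluates model answers with BERTScore, BARTScore, and BLEU. Below is a small sketch of computing two of those metrics with the bert-score and sacrebleu packages (BARTScore is omitted); the candidate and reference sentences are invented for illustration.

```python
# Sketch: scoring candidate answers against references with BERTScore and
# BLEU, using the bert-score and sacrebleu packages. Sentences are hypothetical.
from bert_score import score as bert_score
import sacrebleu

candidates = ["Glaucoma damages the optic nerve, often due to high eye pressure."]
references = ["Glaucoma is optic nerve damage commonly caused by elevated intraocular pressure."]

# BERTScore: token-level similarity computed from contextual embeddings
P, R, F1 = bert_score(candidates, references, lang="en")
print("BERTScore F1:", F1.mean().item())

# Corpus-level BLEU: n-gram overlap with the reference set
bleu = sacrebleu.corpus_bleu(candidates, [references])
print("BLEU:", bleu.score)
```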
  8.

    Comparison of Deep Learning Sentiment Analysis Methods, Including LSTM and Machine Learning by Jean Max T. Habib, A. A. Poguda

    Published 2023-11-01
“…In this case, it is crucial for researchers to explore ways of updating these tools, either by combining them or by adapting them to modern tasks, in order to provide a clearer understanding of their results. We present a comparison of several deep learning models, including convolutional neural networks, recurrent neural networks, and bidirectional long short-term memory, evaluated using different word embedding approaches, including Bidirectional Encoder Representations from Transformers (BERT) and its variants, FastText, and Word2Vec. …”
    Get full text
    Article
  9.

    Identification and analysis of driving factors for product evolution: A text data mining approach by Shifeng Liu, Jianning Su, Shutao Zhang, Kai Qiu, Shijie Wang

    Published 2025-07-01
    “…Traditional studies primarily rely on inductive summarization, which often faces issues of subjectivity, uncertainty, and low reliability. This research presents a method combining the Bidirectional Encoder Representations from Transformers (BERT) model and Dynamic Topic Model (DTM) to analyze the driving factors of product evolution. …”
    Get full text
    Article
  10.

Detecting Chinese Disinformation with Fine-Tuned BERT and Contextual Techniques by Lixin Yun, Sheng Yun, Haoran Xue

    Published 2025-12-01
    “…Building on large language models (LLMs) like BERT (Bidirectional Encoder Representations from Transformers) provides a promising avenue for addressing this challenge. …”
    Get full text
    Article
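The entry above fine-tunes BERT for Chinese disinformation detection. As a rough sketch of such a fine-tuning setup (not the authors' pipeline), the following uses the Hugging Face Trainer with the bert-base-chinese checkpoint on a tiny invented two-example dataset; the labels and hyperparameters are assumptions.

```python
# Sketch: fine-tuning a Chinese BERT checkpoint for binary disinformation
# classification with the Hugging Face Trainer. The tiny in-memory dataset,
# labels, and hyperparameters are all hypothetical.
from datasets import Dataset
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)

data = Dataset.from_dict({
    "text": ["这条新闻已被官方证实。", "震惊!此消息纯属捏造。"],
    "label": [0, 1],  # 0 = reliable, 1 = disinformation (hypothetical labels)
})

tokenizer = AutoTokenizer.from_pretrained("bert-base-chinese")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-chinese", num_labels=2
)

def tokenize(batch):
    # pad/truncate every chat message to a fixed length for batching
    return tokenizer(batch["text"], truncation=True, padding="max_length",
                     max_length=64)

train_ds = data.map(tokenize, batched=True)

args = TrainingArguments(output_dir="bert-disinfo", num_train_epochs=1,
                         per_device_train_batch_size=2, logging_steps=1)
trainer = Trainer(model=model, args=args, train_dataset=train_ds)
trainer.train()
```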
  11.

    Detecting indicators of violence in digital text using deep learning by Abbas Z. Kouzani, Muhammad Nouman

    Published 2025-09-01
“…Word embedding extraction is implemented using the bidirectional encoder representations from transformers (BERT) algorithm. …”
    Get full text
    Article