docs: add User Feedback section to toolbar reading assistance instructions (#5839)

* docs: add User Feedback section to toolbar reading assistance instructions

Describes the shared feedback pattern for word cards (existing) and
translations (planned). References #5838.

* feat: allow users to flag translations

* make flag button smaller

---------

Co-authored-by: ggurdin <ggurdin@gmail.com>
wcjord 2026-03-02 13:01:03 -05:00 committed by GitHub
parent f6d7bfa981
commit 38e908c33c
11 changed files with 201 additions and 69 deletions


@ -77,7 +77,7 @@ The word card is the detailed view for a single token. It appears above the mess
- **Meaning** — either user-set or auto-generated L1 translation
- **Phonetic transcription** — IPA or simplified pronunciation guide
- **Emoji** — the user's personal emoji association (if set), or a picker to set one
- **Feedback button** — lets the user report bad tokenization or incorrect meanings
- **Feedback button** — lets the user flag incorrect token data (POS, meaning, phonetics, language). See §User Feedback below.
The word card is intentionally compact — it should be glanceable, not a full dictionary entry. The goal is quick recognition, not exhaustive reference.
@ -119,6 +119,27 @@ The toolbar must work within chat layout constraints:
- The overlay must survive screen rotation and keyboard appearance without losing state
- On width changes (e.g., split-screen), the overlay dismisses rather than attempting to reposition (avoids jarring layout jumps)
## User Feedback
AI-generated content in the toolbar — word card info and translations — can be wrong. Users need a lightweight way to say "this is incorrect" without leaving the toolbar flow. The pattern is the same everywhere: a small **flag icon** beside the content opens a dialog where the user describes the problem in free text. The server re-generates the content with the feedback in context and returns an improved result.
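As a rough sketch of that shared flow (illustrative names only; `FeedbackDialog` is assumed to resolve to the user's free-text note, and `regenerate` stands in for the content-specific request):

```dart
// Minimal sketch of the shared flag flow, not a fixed API.
// FeedbackDialog and regenerate are hypothetical names here.
Future<void> onFlag(BuildContext context) async {
  // One tap: open a dialog with a single free-text field.
  final note = await showDialog<String?>(
    context: context,
    builder: (context) => FeedbackDialog(
      title: 'Feedback',
      onSubmit: (text) => Navigator.of(context).pop(text),
    ),
  );
  if (note == null || note.isEmpty) return; // user cancelled
  // The server regenerates with the feedback in context; the UI
  // then replaces the old content with the improved result.
  await regenerate(feedback: note);
}
```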
### Design Principles
- **Low friction**: One tap to flag, one text field, done. The user shouldn't need to know *what* is wrong technically — just describe it in their own words.
- **Immediate improvement**: After flagging, the UI replaces the old content with the regenerated version so the user sees the fix right away.
- **Same interaction everywhere**: Word card flagging and translation flagging look and feel identical to the user. Same icon, same dialog, same flow.
- **Auditable**: Every flag is recorded on the server with the user's identity, building a quality signal that improves future results for all users.
### Word Card Feedback (exists)
The word card already has a flag button. When tapped, the user can report issues with tokenization, meaning, phonetics, or language detection. The server figures out which fields need correction and returns updates. See [token-info-feedback-v2.instructions.md](token-info-feedback-v2.instructions.md).
### Translation Feedback (planned)
The full-text translation shown in Translate mode currently has no flag button. Add one — same icon, same dialog, same UX as word card feedback. When the user flags a bad translation, the server regenerates it with a stronger model and the user's feedback as context.
This is especially important for mixed-language and polysemous inputs where the default model gets it wrong (see [#1311](https://github.com/pangeachat/2-step-choreographer/issues/1311), [#1477](https://github.com/pangeachat/2-step-choreographer/issues/1477)). No new server endpoint is needed — the existing translation endpoint already supports feedback. See [direct-translate.instructions.md](../../2-step-choreographer/.github/instructions/direct-translate.instructions.md).
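Concretely, the client can reuse the existing request path and attach the user's note plus the rejected translation as feedback. A sketch using the model names from this change (assuming `userNote` and `lastResponse` are already in hand):

```dart
// Sketch: wrap the user's note and the flagged translation in an
// LLMFeedbackModel, then re-issue the same L1 translation request.
final feedback = [
  LLMFeedbackModel<FullTextTranslationResponseModel>(
    feedback: userNote,        // free text from the dialog
    content: lastResponse,     // the translation being flagged
    contentToJson: (c) => c.toJson(),
  ),
];
final improved = await messageEvent.requestTranslationByL1(
  feedback: feedback,
);
```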
## Key Contracts
- **Overlay, not navigation.** The toolbar never pushes a route. It's a composited overlay that lives on top of the chat. Dismissal returns to the exact same chat state.


@ -5358,5 +5358,6 @@
"cannotJoinBannedRoom": "Banned. Unable to join.",
"sessionFull": "Too late! This activity is full.",
"returnToCourse": "Return to course",
"returnHome": "Return home"
"returnHome": "Return home",
"translationFeedback": "Translation Feedback"
}


@ -180,8 +180,8 @@ class AnalyticsPracticeDataService {
}
// Prefetch the translation
final translation = await pangeaEvent.requestRespresentationByL1();
_setAudioInfo(eventId, audioFile, translation);
final translation = await pangeaEvent.requestTranslationByL1();
_setAudioInfo(eventId, audioFile, translation.bestTranslation);
}
Future<void> _prefetchLemmaInfo(


@ -134,7 +134,8 @@ class GrammarErrorTargetGenerator {
}
try {
translation ??= await event.requestRespresentationByL1();
final resp = await event.requestTranslationByL1();
translation ??= resp.bestTranslation;
} catch (e, s) {
ErrorHandler.logError(
e: e,


@ -159,7 +159,7 @@ class ChoreoRecordModel {
});
bool endedWithIT(String sent) {
return includedIT && stepText() == sent;
return includedIT && !includedIGC && stepText() == sent;
}
/// Get the text at [stepIndex]


@ -11,6 +11,7 @@ import 'package:sentry_flutter/sentry_flutter.dart';
import 'package:fluffychat/pangea/choreographer/choreo_record_model.dart';
import 'package:fluffychat/pangea/common/constants/model_keys.dart';
import 'package:fluffychat/pangea/common/models/llm_feedback_model.dart';
import 'package:fluffychat/pangea/events/event_wrappers/pangea_representation_event.dart';
import 'package:fluffychat/pangea/events/extensions/pangea_event_extension.dart';
import 'package:fluffychat/pangea/events/models/representation_content_model.dart';
@ -31,6 +32,7 @@ import 'package:fluffychat/pangea/text_to_speech/text_to_speech_response_model.d
import 'package:fluffychat/pangea/toolbar/message_practice/message_audio_card.dart';
import 'package:fluffychat/pangea/translation/full_text_translation_repo.dart';
import 'package:fluffychat/pangea/translation/full_text_translation_request_model.dart';
import 'package:fluffychat/pangea/translation/full_text_translation_response_model.dart';
import 'package:fluffychat/widgets/future_loading_dialog.dart';
import '../../../widgets/matrix.dart';
import '../../common/utils/error_handler.dart';
@ -505,7 +507,7 @@ class PangeaMessageEvent {
}
final translation = SttTranslationModel(
translation: res.result!,
translation: res.result!.bestTranslation,
langCode: l1Code,
);
@ -544,44 +546,70 @@ class PangeaMessageEvent {
return _sendRepresentationEvent(res.result!);
}
Future<String> requestRespresentationByL1() async {
Future<FullTextTranslationResponseModel> requestTranslationByL1({
List<LLMFeedbackModel<FullTextTranslationResponseModel>>? feedback,
}) async {
if (_l1Code == null || _l2Code == null) {
throw Exception("Missing language codes");
}
final includedIT =
(originalSent?.choreo?.endedWithIT(originalSent!.text) ?? false) &&
!(originalSent?.choreo?.includedIGC ?? true);
RepresentationEvent? rep;
if (!includedIT) {
// if the message didn't go through translation, get any l1 rep
rep = _representationByLanguage(_l1Code!);
} else {
// if the message went through translation, get the non-original
// l1 rep since originalWritten could contain some l2 words
// (https://github.com/pangeachat/client/issues/3591)
rep = _representationByLanguage(
_l1Code!,
filter: (rep) => !rep.content.originalWritten,
);
if (feedback == null) {
final includedIT =
originalSent?.choreo?.endedWithIT(originalSent!.text) == true;
RepresentationEvent? rep;
if (!includedIT) {
// if the message didn't go through translation, get any l1 rep
rep = _representationByLanguage(_l1Code!);
} else {
// if the message went through translation, get the non-original
// l1 rep since originalWritten could contain some l2 words
// (https://github.com/pangeachat/client/issues/3591)
rep = _representationByLanguage(
_l1Code!,
filter: (rep) => !rep.content.originalWritten,
);
}
if (rep != null) {
return FullTextTranslationResponseModel(
translation: rep.text,
translations: [rep.text],
source: messageDisplayLangCode,
);
}
}
if (rep != null) return rep.content.text;
final includedIT =
originalSent?.choreo?.endedWithIT(originalSent!.text) == true;
final String srcLang = includedIT
? (originalWritten?.langCode ?? _l1Code!)
: (originalSent?.langCode ?? _l2Code!);
final resp = await _requestRepresentation(
includedIT ? originalWrittenContent : messageDisplayText,
_l1Code!,
srcLang,
final text = includedIT ? originalWrittenContent : messageDisplayText;
final resp = await FullTextTranslationRepo.get(
MatrixState.pangeaController.userController.accessToken,
FullTextTranslationRequestModel(
text: text,
srcLang: srcLang,
tgtLang: _l1Code!,
userL2:
MatrixState.pangeaController.userController.userL2Code ??
LanguageKeys.unknownLanguage,
userL1: _l1Code!,
feedback: feedback,
),
);
if (resp.isError) throw resp.error!;
_sendRepresentationEvent(resp.result!);
return resp.result!.text;
_sendRepresentationEvent(
PangeaRepresentation(
langCode: _l1Code!,
text: resp.result!.bestTranslation,
originalSent: false,
originalWritten: false,
),
);
return resp.result!;
}
Future<Result<PangeaRepresentation>> _requestRepresentation(
@ -589,6 +617,7 @@ class PangeaMessageEvent {
String targetLang,
String sourceLang, {
bool originalSent = false,
List<LLMFeedbackModel<FullTextTranslationResponseModel>>? feedback,
}) async {
_representations = null;
@ -600,6 +629,7 @@ class PangeaMessageEvent {
tgtLang: targetLang,
userL2: _l2Code ?? LanguageKeys.unknownLanguage,
userL1: _l1Code ?? LanguageKeys.unknownLanguage,
feedback: feedback,
),
);
@ -608,7 +638,7 @@ class PangeaMessageEvent {
: Result.value(
PangeaRepresentation(
langCode: targetLang,
text: res.result!,
text: res.result!.bestTranslation,
originalSent: originalSent,
originalWritten: false,
),


@ -12,6 +12,7 @@ import 'package:fluffychat/pages/chat/events/message_content.dart';
import 'package:fluffychat/pages/chat/events/reply_content.dart';
import 'package:fluffychat/pangea/common/utils/async_state.dart';
import 'package:fluffychat/pangea/common/widgets/error_indicator.dart';
import 'package:fluffychat/pangea/common/widgets/feedback_dialog.dart';
import 'package:fluffychat/pangea/events/extensions/pangea_event_extension.dart';
import 'package:fluffychat/pangea/events/models/pangea_token_model.dart';
import 'package:fluffychat/pangea/toolbar/layout/reading_assistance_mode_enum.dart';
@ -300,6 +301,7 @@ class OverlayMessage extends StatelessWidget {
controller: selectModeController,
style: style,
maxWidth: maxWidth,
minWidth: messageWidth ?? 0,
),
],
),
@ -313,13 +315,31 @@ class _MessageSelectModeContent extends StatelessWidget {
final SelectModeController controller;
final TextStyle style;
final double maxWidth;
final double minWidth;
const _MessageSelectModeContent({
required this.controller,
required this.style,
required this.maxWidth,
required this.minWidth,
});
Future<void> onFlagTranslation(BuildContext context) async {
final resp = await showDialog<String?>(
context: context,
builder: (context) => FeedbackDialog(
title: L10n.of(context).translationFeedback,
onSubmit: (feedback) => Navigator.of(context).pop(feedback),
),
);
if (resp == null || resp.isEmpty) {
return;
}
await controller.fetchTranslation(feedback: resp);
}
@override
Widget build(BuildContext context) {
return ListenableBuilder(
@ -356,8 +376,13 @@ class _MessageSelectModeContent extends StatelessWidget {
? controller.translationState.value
: controller.speechTranslationState.value;
return Padding(
return Container(
padding: const EdgeInsets.all(12.0),
constraints: BoxConstraints(
minHeight: 40.0,
maxWidth: maxWidth,
minWidth: minWidth,
),
child: switch (state) {
AsyncLoading() => Row(
mainAxisSize: MainAxisSize.min,
@ -371,15 +396,28 @@ class _MessageSelectModeContent extends StatelessWidget {
message: L10n.of(context).translationError,
style: style.copyWith(fontStyle: FontStyle.italic),
),
AsyncLoaded(value: final value) => Container(
constraints: BoxConstraints(maxWidth: maxWidth),
child: SingleChildScrollView(
child: Text(
value,
textScaler: TextScaler.noScaling,
style: style.copyWith(fontStyle: FontStyle.italic),
AsyncLoaded(value: final value) => Row(
spacing: 8.0,
mainAxisSize: MainAxisSize.min,
mainAxisAlignment: MainAxisAlignment.spaceBetween,
children: [
Flexible(
child: Text(
value,
textScaler: TextScaler.noScaling,
style: style.copyWith(fontStyle: FontStyle.italic),
),
),
),
if (mode == SelectMode.translate)
InkWell(
onTap: () => onFlagTranslation(context),
child: Icon(
Icons.flag_outlined,
color: style.color,
size: 16.0,
),
),
],
),
_ => const SizedBox(),
},


@ -7,12 +7,14 @@ import 'package:matrix/matrix.dart';
import 'package:path_provider/path_provider.dart';
import 'package:fluffychat/pangea/analytics_misc/lemma_emoji_setter_mixin.dart';
import 'package:fluffychat/pangea/common/models/llm_feedback_model.dart';
import 'package:fluffychat/pangea/common/utils/async_state.dart';
import 'package:fluffychat/pangea/events/event_wrappers/pangea_message_event.dart';
import 'package:fluffychat/pangea/events/models/pangea_token_text_model.dart';
import 'package:fluffychat/pangea/speech_to_text/speech_to_text_response_model.dart';
import 'package:fluffychat/pangea/toolbar/message_practice/message_audio_card.dart';
import 'package:fluffychat/pangea/toolbar/reading_assistance/select_mode_buttons.dart';
import 'package:fluffychat/pangea/translation/full_text_translation_response_model.dart';
import 'package:fluffychat/widgets/matrix.dart';
class _TranscriptionLoader extends AsyncLoader<SpeechToTextResponseModel> {
@ -38,14 +40,6 @@ class _STTTranslationLoader extends AsyncLoader<String> {
);
}
class _TranslationLoader extends AsyncLoader<String> {
final PangeaMessageEvent messageEvent;
_TranslationLoader(this.messageEvent) : super();
@override
Future<String> fetch() => messageEvent.requestRespresentationByL1();
}
class _AudioLoader extends AsyncLoader<(PangeaAudioFile, File?)> {
final PangeaMessageEvent messageEvent;
_AudioLoader(this.messageEvent) : super();
@ -71,21 +65,26 @@ class _AudioLoader extends AsyncLoader<(PangeaAudioFile, File?)> {
}
}
typedef _TranslationLoader = ValueNotifier<AsyncState<String>>;
class SelectModeController with LemmaEmojiSetter {
final PangeaMessageEvent messageEvent;
final _TranscriptionLoader _transcriptLoader;
final _TranslationLoader _translationLoader;
final _AudioLoader _audioLoader;
final _STTTranslationLoader _sttTranslationLoader;
SelectModeController(this.messageEvent)
: _transcriptLoader = _TranscriptionLoader(messageEvent),
_translationLoader = _TranslationLoader(messageEvent),
_translationLoader = _TranslationLoader(AsyncIdle<String>()),
_audioLoader = _AudioLoader(messageEvent),
_sttTranslationLoader = _STTTranslationLoader(messageEvent);
ValueNotifier<SelectMode?> selectedMode = ValueNotifier<SelectMode?>(null);
FullTextTranslationResponseModel? _lastTranslationResponse;
// Sometimes the same token is clicked twice. Setting it to the same value
// won't trigger the notifier, so use the bool to force it to trigger.
ValueNotifier<(PangeaTokenText?, bool)> playTokenNotifier =
@ -109,8 +108,7 @@ class SelectModeController with LemmaEmojiSetter {
static List<SelectMode> get _audioModes => [SelectMode.speechTranslation];
ValueNotifier<AsyncState<String>> get translationState =>
_translationLoader.state;
ValueNotifier<AsyncState<String>> get translationState => _translationLoader;
ValueNotifier<AsyncState<SpeechToTextResponseModel>> get transcriptionState =>
_transcriptLoader.state;
@ -179,7 +177,7 @@ class SelectModeController with LemmaEmojiSetter {
bool get isShowingExtraContent =>
(selectedMode.value == SelectMode.translate &&
_translationLoader.isLoaded) ||
_translationLoader.value is AsyncLoaded<String>) ||
(selectedMode.value == SelectMode.speechTranslation &&
_sttTranslationLoader.isLoaded) ||
_transcriptLoader.isLoaded ||
@ -191,7 +189,7 @@ class SelectModeController with LemmaEmojiSetter {
ValueNotifier<AsyncState>? modeStateNotifier(SelectMode? mode) =>
switch (mode) {
SelectMode.audio => _audioLoader.state,
SelectMode.translate => _translationLoader.state,
SelectMode.translate => _translationLoader,
SelectMode.speechTranslation => _sttTranslationLoader.state,
_ => null,
};
@ -205,7 +203,32 @@ class SelectModeController with LemmaEmojiSetter {
playTokenNotifier.value = (token, !playTokenNotifier.value.$2);
Future<void> fetchAudio() => _audioLoader.load();
Future<void> fetchTranslation() => _translationLoader.load();
Future<void> fetchTranscription() => _transcriptLoader.load();
Future<void> fetchSpeechTranslation() => _sttTranslationLoader.load();
Future<void> fetchTranslation({String? feedback}) async {
try {
_translationLoader.value = AsyncLoading();
List<LLMFeedbackModel<FullTextTranslationResponseModel>>? feedbackModel;
if (feedback != null && _lastTranslationResponse != null) {
feedbackModel = [
LLMFeedbackModel<FullTextTranslationResponseModel>(
feedback: feedback,
content: _lastTranslationResponse!,
contentToJson: (c) => c.toJson(),
),
];
}
final resp = await messageEvent.requestTranslationByL1(
feedback: feedbackModel,
);
_lastTranslationResponse = resp;
_translationLoader.value = AsyncLoaded(resp.bestTranslation);
} catch (e) {
_translationLoader.value = AsyncError(e.toString());
}
}
}


@ -14,7 +14,7 @@ import '../common/network/requests.dart';
import '../common/network/urls.dart';
class _TranslateCacheItem {
final Future<String> response;
final Future<FullTextTranslationResponseModel> response;
final DateTime timestamp;
const _TranslateCacheItem({required this.response, required this.timestamp});
@ -24,7 +24,7 @@ class FullTextTranslationRepo {
static final Map<String, _TranslateCacheItem> _cache = {};
static const Duration _cacheDuration = Duration(minutes: 10);
static Future<Result<String>> get(
static Future<Result<FullTextTranslationResponseModel>> get(
String accessToken,
FullTextTranslationRequestModel request,
) {
@ -38,7 +38,7 @@ class FullTextTranslationRepo {
return _getResult(request, future);
}
static Future<String> _fetch(
static Future<FullTextTranslationResponseModel> _fetch(
String accessToken,
FullTextTranslationRequestModel request,
) async {
@ -60,12 +60,12 @@ class FullTextTranslationRepo {
return FullTextTranslationResponseModel.fromJson(
jsonDecode(utf8.decode(res.bodyBytes)),
).bestTranslation;
);
}
static Future<Result<String>> _getResult(
static Future<Result<FullTextTranslationResponseModel>> _getResult(
FullTextTranslationRequestModel request,
Future<String> future,
Future<FullTextTranslationResponseModel> future,
) async {
try {
final res = await future;
@ -77,7 +77,9 @@ class FullTextTranslationRepo {
}
}
static Future<String>? _getCached(FullTextTranslationRequestModel request) {
static Future<FullTextTranslationResponseModel>? _getCached(
FullTextTranslationRequestModel request,
) {
final cacheKeys = [..._cache.keys];
for (final key in cacheKeys) {
if (DateTime.now().difference(_cache[key]!.timestamp) >= _cacheDuration) {
@ -90,7 +92,7 @@ class FullTextTranslationRepo {
static void _setCached(
FullTextTranslationRequestModel request,
Future<String> response,
Future<FullTextTranslationResponseModel> response,
) => _cache[request.hashCode.toString()] = _TranslateCacheItem(
response: response,
timestamp: DateTime.now(),
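One subtlety worth noting: `FullTextTranslationRepo` keys its cache on the request's `hashCode`, and `feedback` is now part of equality and hashing. A flagged retry therefore hashes to a new cache key and goes back to the server instead of returning the translation the user just rejected. Roughly (field values are placeholders):

```dart
// Same text and languages, but the retry carries feedback, so the
// two requests generally hash to different cache keys.
final first = FullTextTranslationRequestModel(
  text: text, srcLang: src, tgtLang: tgt, userL2: l2, userL1: l1,
);
final retry = FullTextTranslationRequestModel(
  text: text, srcLang: src, tgtLang: tgt, userL2: l2, userL1: l1,
  feedback: feedbackList,
);
// first.hashCode != retry.hashCode in general, so the retry misses
// the 10-minute cache and hits the translation endpoint again.
```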


@ -1,4 +1,8 @@
import 'package:collection/collection.dart';
import 'package:fluffychat/pangea/common/constants/model_keys.dart';
import 'package:fluffychat/pangea/common/models/llm_feedback_model.dart';
import 'package:fluffychat/pangea/translation/full_text_translation_response_model.dart';
class FullTextTranslationRequestModel {
final String text;
@ -9,6 +13,7 @@ class FullTextTranslationRequestModel {
final bool? deepL;
final int? offset;
final int? length;
final List<LLMFeedbackModel<FullTextTranslationResponseModel>>? feedback;
const FullTextTranslationRequestModel({
required this.text,
@ -19,6 +24,7 @@ class FullTextTranslationRequestModel {
this.deepL = false,
this.offset,
this.length,
this.feedback,
});
Map<String, dynamic> toJson() => {
@ -30,6 +36,8 @@ class FullTextTranslationRequestModel {
ModelKey.deepL: deepL,
ModelKey.offset: offset,
ModelKey.length: length,
if (feedback != null)
ModelKey.feedback: feedback!.map((f) => f.toJson()).toList(),
};
// override equals and hashcode
@ -45,7 +53,8 @@ class FullTextTranslationRequestModel {
other.userL1 == userL1 &&
other.deepL == deepL &&
other.offset == offset &&
other.length == length;
other.length == length &&
ListEquality().equals(other.feedback, feedback);
}
@override
@ -57,5 +66,6 @@ class FullTextTranslationRequestModel {
userL1.hashCode ^
deepL.hashCode ^
offset.hashCode ^
length.hashCode;
length.hashCode ^
ListEquality().hash(feedback);
}


@ -2,25 +2,31 @@ import 'package:fluffychat/pangea/common/constants/model_keys.dart';
class FullTextTranslationResponseModel {
final List<String> translations;
final String translation;
final String source;
final String? deepL;
const FullTextTranslationResponseModel({
required this.translation,
required this.translations,
required this.source,
required this.deepL,
});
factory FullTextTranslationResponseModel.fromJson(Map<String, dynamic> json) {
return FullTextTranslationResponseModel(
translation: json['translation'] as String,
translations: (json["translations"] as Iterable)
.map<String>((e) => e)
.toList()
.cast<String>(),
source: json[ModelKey.srcLang],
deepL: json['deepl_res'],
);
}
String get bestTranslation => deepL ?? translations.first;
Map<String, dynamic> toJson() => {
'translation': translation,
'translations': translations,
ModelKey.srcLang: source,
};
String get bestTranslation => translation;
}