QStringTokenizer::toContainer(): allow more types of target containers

The previous constraint required the container's value_type to exactly
match the tokenizer's value_type, which meant toContainer() could only
ever fill containers of views. But there is value in allowing
QStringList, even though that works only for QL1S haystacks
(QStringView -> QString isn't implicit). Users may also have other
types that, for better or worse, implicitly convert from views, so we
shouldn't over-constrain the function.

[ChangeLog][QtCore][QStringTokenizer] toContainer() now works on
containers whose value_type can be constructed from the tokenizer's
value_type. It no longer requires an exact match.

Fixes: QTBUG-101702
Change-Id: Ie384cd1c4b51eaa57675f2a014141ceec8651c81
Reviewed-by: Fabian Kosmale <fabian.kosmale@qt.io>
Reviewed-by: Qt CI Bot <qt_ci_bot@qt-project.org>
Author: Marc Mutz
Date:   2022-03-15 07:46:03 +01:00
parent  6eda249402
commit  f3c340c276
2 changed files with 14 additions and 1 deletion


@@ -292,7 +292,7 @@ class QStringTokenizer
     using if_haystack_not_pinned = typename if_haystack_not_pinned_impl<Container, HPin>::type;
     template <typename Container, typename Iterator = decltype(std::begin(std::declval<Container>()))>
     using if_compatible_container = typename std::enable_if<
-        std::is_same<
+        std::is_convertible<
             typename Base::value_type,
             typename std::iterator_traits<Iterator>::value_type
         >::value,


@@ -145,6 +145,19 @@ void tst_QStringTokenizer::toContainer() const
         auto v = tok.toContainer();
         QVERIFY((std::is_same_v<decltype(v), QList<QLatin1String>>));
     }
+    // QLatin1String value_type into QStringList
+    {
+        auto tok = qTokenize(QLatin1String{"a,b,c"}, u',');
+        QStringList result;
+        tok.toContainer(result);
+        QCOMPARE(result, QStringList({"a", "b", "c"}));
+    }
+    // QLatin1String value_type into QStringList: rvalue overload
+    {
+        QStringList result;
+        qTokenize(QLatin1String{"a,b,c"}, u',').toContainer(result);
+        QCOMPARE(result, QStringList({"a", "b", "c"}));
+    }
 }
QTEST_APPLESS_MAIN(tst_QStringTokenizer)